
Category Archives: Quantum Computing

3 Quantum Computing Stocks to Turn $100,000 Into $1 Million: June Edition – InvestorPlace

Posted: June 13, 2024 at 4:37 pm

Capitalize on the synergy between AI and quantum with these millionaire-maker quantum computing stocks

Quantum computing, with its unparalleled data processing speed, has the potential to usher in a new era in tech. Moreover, the synergy between AI and quantum computing will elevate millionaire-maker quantum computing stocks to new heights. The industry is likely to achieve these kinds of returns as a result of becoming a new critical technology at the center of data processing and connection.

Moreover, quantum tech is leaving traditional silicon-based systems in the dust. Beyond this, some of the most influential companies in the tech world are driving the industry, promising exciting opportunities for investors. However, backing the right horses in the race for quantum supremacy is important to maximize your upside potential.

That said, here are three millionaire-maker quantum computing stocks worth investing in for the long haul. That's because the industry still operates on the fringes of science and technology, making it a long-term play for those looking for generous returns.

IonQ (NYSE:IONQ) is the top pure-play quantum computing stock, perhaps the most promising among its peers. It has made some impressive strides of late, achieving ion stability for an hour, a feat that comfortably outpaces its competition. Its promise is reflected in its recent strong financial performance. It recently reported its first-quarter (Q1) results, where sales soared 77.2% on a year-over-year (YOY) basis to $7.6 million. Additionally, its loss of 19 cents per share beat expectations by six cents. For the full year, it expects sales between $37 million and $41 million, over 70% growth at the midpoint on a YOY basis. Moreover, the company recently partnered with Oak Ridge National Laboratory (ORNL) to leverage quantum technology to modernize the power grid. This stellar partnership, along with others, demonstrates IonQ's ability to innovate and expand its applications, offering healthy long-term upside for its investors.


Investing in quantum computing can be complicated and speculative at the same time. To simplify the process, the Defiance Quantum ETF (NYSEARCA:QTUM) works well, as it also invests in AI stocks to provide a balanced cushion.

The QTUM ETF offers investors exposure to some of the leading global businesses in transformative technologies such as machine learning, quantum computing, and cloud platforms. It holds investments in 70 different stocks, with its top 10 holdings representing just 20% of its $252 million in net assets. Hence, its holdings are highly diversified, with an expense ratio of just 0.40%. Some of the companies in its investment portfolio are MicroStrategy (NASDAQ:MSTR), Nvidia (NASDAQ:NVDA), and MKS Instruments (NASDAQ:MKSI), to name a few.

Moreover, QTUM stock has been a smashing success for its investors in the past five years, generating a total return of over 175%, 361% higher than the median of all ETFs. In the past year alone, it's up 30% and is positioned for healthy long-term gains.


Microsoft (NASDAQ:MSFT), a tech giant, has tentacles in virtually every major tech vertical, and quantum computing is no different. The AI revolution took Microsoft's business up a notch or two last year, and it is eyeing quantum computing as the next frontier. Its partnership with quantum computing pure-play Quantinuum could be a breakthrough for the entire sector. According to a recent statement from one of Microsoft's executives, the company has made massive progress in reducing qubit error rates, which is critical for commercializing quantum technology. Its qubit-virtualization system, applied to Quantinuum's ion-trap hardware, led to more than 14,000 error-free experiments. The breakthrough will set the stage for Quantinuum's Helios H-Series quantum computer by next year. Moreover, the collaboration between the two tech companies aims to go from 100 reliable logical qubits to a whopping 1,000 qubits. If these lofty plans come to fruition, I won't be surprised if MSFT stock goes on another monumental run like last year's.

On the date of publication, Muslim Farooque did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Muslim Farooque is a keen investor and an optimist at heart. A life-long gamer and tech enthusiast, he has a particular affinity for analyzing technology stocks. Muslim holds a bachelor of science degree in applied accounting from Oxford Brookes University.

Read the original here:

3 Quantum Computing Stocks to Turn $100,000 Into $1 Million: June Edition - InvestorPlace

Posted in Quantum Computing | Comments Off on 3 Quantum Computing Stocks to Turn $100,000 Into $1 Million: June Edition – InvestorPlace

Recurrent quantum embedding neural network and its application in vulnerability detection | Scientific Reports – Nature.com

Posted: at 4:37 pm

In this section, we first introduce the composition and principles of the important components of RQENN, including the trainable encoding method based on parameterized binary indexes and the recurrent cell. Then we introduce the RQENN-based classification model. Finally, we present the task flow of applying the RQENN classification model to vulnerability detection.

Classical neural network models for NLP tasks first need to tokenize the text and build a word dictionary, according to which the text is converted into a sequence of numeric word indexes. Each numeric index corresponds to a one-hot vector, and these vectors are transformed into dense vectors by word embedding methods involved in the network training to obtain a more accurate vector representation. However, similar methods migrated to QNNs do not work. Specifically, in the classical model, the one-hot vectors are sparse and orthogonal, which means that when word embedding is performed, each word gets only some of the weights from the embedding weight matrix $W$ as its representation vector. This process can be viewed as using the one-hot vector as a key to query the corresponding value in the weight map $W$, as shown in Fig. 1a. Thus, under random initialization of the weights, the initial representation vectors of all words are uncorrelated, and they establish lexical connections as the training process proceeds. However, in the quantum model, due to the properties of quantum superposition and entanglement, the quantum state obtained from encoded words (e.g., $|\psi_1\rangle$ in Fig. 1b) is difficult to make as sparse and orthogonal as the classical one-hot vectors. This implies that the initially encoded quantum states of different words have some kind of connection, and using such a quantum state as the "key" inevitably means that the "value" obtained from the query is related to all the elements of the unitary weight matrix, so the result contains various non-semantic connections. This prevents the QNN from learning the semantics of words through the quantum embedding method.

(a) Classical word embedding computation process. The one-hot vector is treated as a key used to query the corresponding value in the map of weights $W$. (b) The process of the token 'NULL' being encoded into a quantum circuit. The state $|\psi_1\rangle$ obtained after the rotation input layer is treated as a quantum one-hot vector. QEmbedding is then applied to obtain $|\psi_2\rangle$, which can be treated as a quantum dense vector used as the token representation.

To address this problem, we propose a trainable encoding method based on parameterized binary index to encode code tokens as quantum state data and efficiently learn the semantics of the tokens. The specific steps are as follows:

Step I: Tokenize the source code to create a dictionary, and map each token to a numeric index.

Step II: Convert the numeric indexes from decimal to binary representation. For a dictionary containing $N$ words, an index is represented by an $n=\lceil \log_2 N\rceil$-bit binary number.

Step III: Replace "0" and "1" in the binary number indexes with the trainable parameters "({theta }_{0})" and "({theta }_{1})", forming parameterized binary indexes.

Step IV: Encode the parameterized binary index using $n=\lceil \log_2 N\rceil$ qubits. Each bit of the input is encoded on the corresponding qubit through an Ry rotation gate.

Step V: Add a trainable layer containing parameters to the quantum circuit as a quantum embedding (QEmbedding) implementation.

The Ry rotation input layer and the QEmbedding layer in the above steps together form the trainable encoding layer.
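For illustration, a minimal Python sketch of Steps I-III might look as follows (this is our own sketch, not code from the paper; the tokenizer and the symbol names "theta_0"/"theta_1" are illustrative assumptions):

```python
import math

def build_parameterized_indexes(corpus_tokens):
    """Steps I-III: dictionary -> numeric index -> binary -> parameterized binary index."""
    vocab = {}
    for tok in corpus_tokens:                      # Step I: build the token dictionary
        vocab.setdefault(tok, len(vocab))
    n = max(1, math.ceil(math.log2(len(vocab))))   # number of bits/qubits, n = ceil(log2 N)
    param_index = {}
    for tok, idx in vocab.items():
        bits = format(idx, f"0{n}b")               # Step II: decimal index -> n-bit binary
        # Step III: replace '0'/'1' with the trainable symbols theta_0 / theta_1
        param_index[tok] = ["theta_0" if b == "0" else "theta_1" for b in bits]
    return param_index

tokens = ["VAR1", "=", "NULL", "VAR2", "=", "NULL", "VAR1", "=", "VAR2"]
print(build_parameterized_indexes(tokens))
# e.g. {'VAR1': ['theta_0', 'theta_0'], '=': ['theta_0', 'theta_1'],
#       'NULL': ['theta_1', 'theta_0'], 'VAR2': ['theta_1', 'theta_1']}
# (the exact index assigned to each token depends on tokenization order)
```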

As a simple example, for the following source code training set:

$$\left[\ \text{``VAR1 = NULL''},\ \text{``VAR2 = NULL''},\ \text{``VAR1 = VAR2''}\ \right],$$

we can build such a dictionary to map all code tokens to numeric indexes:

$$\{\ \text{`='}: 0,\ \text{`VAR1'}: 1,\ \text{`NULL'}: 2,\ \text{`VAR2'}: 3\ \}.$$

These numeric indexes are further converted to parameterized binary indexes:

$$\{\ \text{`='}: \theta_0\theta_0,\ \text{`VAR1'}: \theta_0\theta_1,\ \text{`NULL'}: \theta_1\theta_0,\ \text{`VAR2'}: \theta_1\theta_1\ \}.$$

Next, we determine the angles of the Ry gates and construct the quantum circuit based on the parameterized binary index of the word to be encoded. Taking the encoding of the token 'NULL' as an example, as shown in Fig. 1b, its corresponding index $\theta_1\theta_0$ is encoded bit-by-bit into a circuit with 2 qubits as the Ry gate angles. A QEmbedding layer is then added to jointly form the quantum trainable encoding circuit.

As Eqs. (1)-(3) below show, $|\psi_0\rangle$ is the initial state. The quantum circuit encodes the index $\theta_1\theta_0$ into the quantum state $|\psi_1\rangle$ through the rotation input layer. It is a $2^n$-dimensional vector with $N$ different cases, corresponding to the $N$ possible combinations of the input rotation layer parameters. The parameters $\theta_0$ and $\theta_1$ are involved in the training process of the QNN to eliminate possible inherent connections, so that $|\psi_1\rangle$ serves the same function as the classical one-hot vector: the one-hot vector is a symbolic form that is easy for the classical network model to use, and $|\psi_1\rangle$ is the corresponding symbolic form that is easy for the QNN to use. This is the unique aspect of trainable encoding based on parameterized binary indexes and the key to improving model performance. Next, $|\psi_1\rangle$ learns lexical connections between encoded words through a trainable QEmbedding layer $U_{qe}(\boldsymbol{\theta}_{qe})$, which is similar to the classical word embedding principle. The resulting quantum state $|\psi_2\rangle$ is described in Eq. (3), where $U_{qe}(\boldsymbol{\theta}_{qe}) = [\,\boldsymbol{u}_0^{\dagger}\ \ \boldsymbol{u}_1^{\dagger}\ \ \boldsymbol{u}_2^{\dagger}\ \ \boldsymbol{u}_3^{\dagger}\,]^{\dagger}$. At this point, the $2^n$-dimensional dense vector corresponding to $|\psi_2\rangle$ is the representation of the word, except that the word is converted from an index into a quantum-friendly quantum state representation instead of a classical vector representation.

$$|\psi_0\rangle = \begin{bmatrix}\varepsilon_0 & \varepsilon_1 & \varepsilon_2 & \varepsilon_3\end{bmatrix}^{\dagger}$$

(1)

$$|\psi_1\rangle = \begin{cases} Ry(\theta_0)\otimes Ry(\theta_0)\,|\psi_0\rangle, & \text{index}=\theta_0\theta_0\\ Ry(\theta_0)\otimes Ry(\theta_1)\,|\psi_0\rangle, & \text{index}=\theta_0\theta_1\\ Ry(\theta_1)\otimes Ry(\theta_0)\,|\psi_0\rangle, & \text{index}=\theta_1\theta_0\\ Ry(\theta_1)\otimes Ry(\theta_1)\,|\psi_0\rangle, & \text{index}=\theta_1\theta_1 \end{cases}$$

(2)

$$|\psi_2\rangle = U_{qe}(\boldsymbol{\theta}_{qe})\,|\psi_1\rangle = \begin{bmatrix}\boldsymbol{u}_0^{\dagger}|\psi_1\rangle & \boldsymbol{u}_1^{\dagger}|\psi_1\rangle & \boldsymbol{u}_2^{\dagger}|\psi_1\rangle & \boldsymbol{u}_3^{\dagger}|\psi_1\rangle\end{bmatrix}^{\dagger}$$

(3)
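To make the encoding concrete, here is a minimal PennyLane sketch of the trainable encoding circuit for the token 'NULL' in the two-qubit example above (our own sketch under assumed circuit details, not the authors' released code): the rotation input layer applies the parameterized binary index $\theta_1\theta_0$, and a small Ry/CNOT ansatz stands in for the QEmbedding layer $U_{qe}$.

```python
import pennylane as qml
import numpy as np

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def trainable_encoding(index_angles, qe_weights):
    # Rotation input layer (Eq. 2): one Ry per qubit, angles taken from the
    # token's parameterized binary index, e.g. (theta_1, theta_0) for 'NULL'.
    for i, angle in enumerate(index_angles):
        qml.RY(angle, wires=i)
    # Stand-in for the QEmbedding layer U_qe (Eqs. 7-8): alternating Ry
    # rotations and a CNOT entangler; in the full model these weights,
    # together with theta_0 and theta_1, would all be trained.
    for layer_weights in qe_weights:
        for i in range(n_qubits):
            qml.RY(layer_weights[i], wires=i)
        qml.CNOT(wires=[0, 1])
    return qml.state()                     # |psi_2>, the token's quantum dense vector

theta_0, theta_1 = -np.pi / 2, np.pi / 2   # illustrative starting values (orthogonal method)
qe_weights = np.random.uniform(0, 2 * np.pi, size=(2, n_qubits))
print(trainable_encoding([theta_1, theta_0], qe_weights))   # 'NULL' -> index theta_1 theta_0
```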

Compared with the trainable encoding method based on parameterized binary indexes, if the binary index obtained in Step II is used directly for encoding, the fixed rotation-gate angles (0 or 1) result in constant non-lexical connections between the $|\psi_1\rangle$ states of different words. These connections are carried into the training process of the quantum word embedding layer and may interfere with the normal learning of the lexical connections between code tokens. In fact, the $N$ quantum states $|\psi_1\rangle$ can also be made mutually orthogonal, like classical one-hot vectors, by choosing suitable fixed rotation angles; we call this the "orthogonal method". It determines the specific angles $\theta_0$ and $\theta_1$ used to replace the binary "0" and "1" before training. By applying the $N$ different rotation layers with angles $\theta_0$ and $\theta_1$ to $N$ independent quantum circuits, we obtain $N$ quantum states. We use the gradient descent algorithm to minimize the sum of the absolute values of the pairwise inner products of these $N$ quantum states under random initialization of the initial quantum states. This approach mirrors the mutual orthogonality of one-hot vectors and ultimately yields $\theta_0=-\frac{\pi}{2}$ and $\theta_1=\frac{\pi}{2}$; encoding with these values makes the $|\psi_1\rangle$ states of different tokens as orthogonal as possible. There are, however, differences between QEmbedding and classical embedding. Each element of the weight matrix in classical embedding is a trainable parameter, whereas QEmbedding controls the changes of the matrix through only a small number of parameterized unitary gates, so it cannot be proved that keeping the $|\psi_1\rangle$ states always orthogonal is more helpful for learning $|\psi_2\rangle$. Therefore, in this paper we add $\theta_0$ and $\theta_1$ as trainable parameters to the learning process of RQENN, which is why parameterized binary indexes are used in Step III. In the Results section we compare the performance of the model when using trainable encoding based on the binary index, the orthogonal method, and the parameterized binary index as data inputs, further demonstrating the effectiveness of the proposed method.
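As a quick numerical check of the orthogonal method (our sketch, under the simplifying assumption that the initial state is $|\psi_0\rangle = |00\rangle$ with real amplitudes), the four input-layer states produced by $\theta_0=-\pi/2$ and $\theta_1=\pi/2$ come out pairwise orthogonal:

```python
import numpy as np

def ry(theta):
    # Single-qubit Ry rotation matrix.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

theta_0, theta_1 = -np.pi / 2, np.pi / 2
psi_0 = np.array([1.0, 0.0, 0.0, 0.0])           # assume |psi_0> = |00>
states = [np.kron(ry(a), ry(b)) @ psi_0           # Ry(a) (x) Ry(b) |psi_0>
          for a in (theta_0, theta_1) for b in (theta_0, theta_1)]
gram = np.round([[abs(u @ v) for v in states] for u in states], 6)
print(gram)   # 4x4 identity: the four encoded states are mutually orthogonal
```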

The trainable encoding method defined in the above section is a crucial component in the construction of our recurrent quantum embedding neural network cell. Much like in classical RNNs, we define a cell that is successively applied to the input presented to the network to capture contextual connections in the code. More specifically, the cell comprises a trainable encoding stage and a working stage, which are used to learn the semantics of input tokens and to memorize contextual dependencies, respectively. This cell is applied iteratively in RQENN, and its internal state is passed on to the next iteration of the network. RQENN cells at all time steps share the same trainable parameters.

Figure 2 illustrates the RQENN cell, which learns the quantum word embedding of the current time-step input $\boldsymbol{x}_t = (x_{t_0},\dots,x_{t_n})$ in the encoding stage and combines it with the cell's input hidden state $|\psi_{t-1}\rangle$ in the working stage to learn the mapping from this combined state to the cell's output hidden state $|\psi_t\rangle$. The equation for this process is as follows:

$$|\psi_t\rangle = U_{qnn}\,U_{qe}\,U_{in}(\boldsymbol{x}_t)\,|\psi_{t-1}\rangle$$

(4)

where $U_{in}$, $U_{qe}$ and $U_{qnn}$ denote the unitary matrices of the rotation input layer, the QEmbedding layer and the quantum weight (QWeight) layer, respectively.

Recurrent quantum embedding neural network cell. It consists of a trainable encoding stage and a QNN working stage, where the principle of the encoding stage is as described in the above section. It transforms the incoming internal state into the outgoing state at each time step and iterates this process.

The encoding stage uses the trainable encoding method described above. In the rotation input layer, an Ry gate is applied to the $i$th qubit to rotate it by the $i$th value of the parameterized binary index. In the QEmbedding layer, an $m$-layer ansatz composed of alternating rotation and entanglement layers is used to learn the quantum word embedding representation. Each layer of the ansatz consists of 2 rotation layers and 2 entanglement layers made of staggered entangling gates between adjacent qubits. In the working stage, an $n$-layer one-dimensional alternating-layered Hardware Efficient Ansatz48,49 is used to build the QWeight layer. This ansatz is implemented by sequentially applying a two-qubit unitary to adjacent qubits; each two-qubit unitary entangles the last qubit acted on by the previous unitary with the next one. The unique recurrent circuit cell architecture, with scalable layers in multiple stages, is the key to improving model performance. The two-qubit unitary consists of two Ry gates and a CNOT gate, a combination that has been proven effective50, and its unitary transformation is described by Eq. (5). We show below the specific implementations of the different network layers. Equation (6) shows the unitary transformation of the rotation input layer, where $t\in\{1,\dots,T\}$ is the time step; a token is input into the network at each time step, and $T$ is set to the total code length. Equations (7)-(8) and Eq. (9) give the unitary transformations of the QEmbedding layer and the QWeight layer, respectively.

$$U_{l,i}^{[2]}(\boldsymbol{\theta}_{l,i}) = \mathrm{CNOT}_{i,i+1}\bigotimes_{j=0}^{1} Ry_{i+j}(\theta_{l,i,j}),\quad l\in\{0,1\}\ \text{and}\ i\in\{0,\dots,n-2\}$$

(5)

$$U_{in}(\boldsymbol{x}_t) = \bigotimes_{i=0}^{n}\left(Ry_i(x_{t_i})\right),\quad x_{t_i}\in\{\theta_0,\theta_1\}\ \text{and}\ t\in\{1,\dots,T\}$$

(6)

$$U_{qe_l}(\boldsymbol{\theta}_{qe_l}) = \prod_{i=1}^{\lfloor (n-1)/2\rfloor} \mathrm{CNOT}_{2i-1,2i}\;\bigotimes_{i=0}^{n-1} Ry_i(\theta_{l,n+i})\;\prod_{i=1}^{\lfloor n/2\rfloor} \mathrm{CNOT}_{2i-2,2i-1}\;\bigotimes_{i=0}^{n-1} Ry_i(\theta_{l,i})$$

(7)

$$U_{qe}(\boldsymbol{\theta}_{qe}) = U_{qe_1}(\boldsymbol{\theta}_{qe_1})\,U_{qe_0}(\boldsymbol{\theta}_{qe_0})$$

(8)

$$U_{qnn}(\boldsymbol{\theta}_{qnn}) = U_{1,n-2}^{[2]}(\boldsymbol{\theta}_{1,n-2})\cdots U_{1,0}^{[2]}(\boldsymbol{\theta}_{1,0})\,U_{0,n-2}^{[2]}(\boldsymbol{\theta}_{0,n-2})\cdots U_{0,0}^{[2]}(\boldsymbol{\theta}_{0,0})$$

(9)
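A minimal PennyLane rendering of the QWeight building blocks described by Eqs. (5) and (9) might look as follows (our sketch; the function names and the demo wiring are illustrative assumptions, not the authors' code):

```python
import pennylane as qml
import numpy as np

def two_qubit_block(theta_pair, i):
    # Eq. (5): two Ry rotations on qubits i and i+1 followed by a CNOT between them.
    qml.RY(theta_pair[0], wires=i)
    qml.RY(theta_pair[1], wires=i + 1)
    qml.CNOT(wires=[i, i + 1])

def qweight_layer(weights, n_qubits):
    # Eq. (9): two alternating sweeps (l = 0, 1) of the two-qubit block over
    # adjacent pairs, forming the hardware-efficient QWeight layer U_qnn.
    for l in range(2):
        for i in range(n_qubits - 1):
            two_qubit_block(weights[l, i], i)

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qweight_demo(weights):
    qweight_layer(weights, n_qubits)
    return qml.state()

print(qml.draw(qweight_demo)(np.zeros((2, n_qubits - 1, 2))))   # show the gate layout
```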

We use the RQENN cell to build classifiers applied to vulnerability detection. Similar to an RNN, RQENN initializes the hidden state at $t=0$ by first adding a layer of Hadamard gates, and then the RQENN cell is iteratively applied to the input source code sequence $\boldsymbol{x}_1,\boldsymbol{x}_2,\dots,\boldsymbol{x}_T$, as shown in Fig. 3, to capture the contextual connections in the source code. The entire model also includes measuring the single-qubit expectation value of each of the last two qubits. This expectation value is described by Eq. (10):

$$E_i(\boldsymbol{X},\boldsymbol{\Theta}) = \langle 0^{\otimes n}|\,H^{\dagger\otimes n}\,U_{QC}^{\dagger}(\boldsymbol{X},\boldsymbol{\Theta})\,\widehat{M}_i\,U_{QC}(\boldsymbol{X},\boldsymbol{\Theta})\,H^{\otimes n}\,|0^{\otimes n}\rangle,\quad i\in\{n-1,n-2\}$$

(10)

where $U_{QC}(\boldsymbol{X},\boldsymbol{\Theta}) = U_{cell}(\boldsymbol{x}_1,\boldsymbol{\Theta})\cdots U_{cell}(\boldsymbol{x}_T,\boldsymbol{\Theta})$ is the quantum circuit composed of all cells, and $U_{cell}(\boldsymbol{x}_t,\boldsymbol{\Theta}) = U_{qnn}(\boldsymbol{\theta}_{qnn})\,U_{qe}(\boldsymbol{\theta}_{qe})\,U_{in}(\boldsymbol{x}_t,\boldsymbol{\theta}_{in})$. $\boldsymbol{\Theta}$ is the parameter set of the cell, and $\boldsymbol{X}=[\boldsymbol{x}_1,\dots,\boldsymbol{x}_T]$ is the input index sequence. $\widehat{M}_i$ is the operator used to calculate the expectation value of the $i$th qubit, i.e.

RQENN classifier. The model is built by iteratively applying the same RQENN cell to the input code token sequence. Measurements are performed on the last two qubits separately to obtain the expectation values as classification logits.

$$\widehat{M}_i = \begin{cases} I\otimes I\otimes \cdots \otimes \sigma_z \otimes I, & i=n-2\\ I\otimes I\otimes \cdots \otimes I\otimes \sigma_z, & i=n-1 \end{cases}$$

(11)

The two calculated expectation values are used to determine the data category by comparing their magnitudes, and we use them as logits to compute the cross-entropy loss for classification.
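To make the readout concrete, here is a sketch (ours, with a deliberately simplified stand-in for the RQENN cell) of how the two Pauli-Z expectation values of Eqs. (10)-(11) can serve as logits for a cross-entropy loss:

```python
import pennylane as qml
import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def rqenn_cell(x_t, params):
    # Simplified stand-in for one RQENN cell: rotation input layer (Eq. 6),
    # then one Ry/CNOT sweep standing in for the QEmbedding and QWeight layers.
    for i, angle in enumerate(x_t):
        qml.RY(angle, wires=i)
    for i in range(n_qubits):
        qml.RY(params[i], wires=i)
    for i in range(n_qubits - 1):
        qml.CNOT(wires=[i, i + 1])

@qml.qnode(dev)
def rqenn_classifier(index_sequence, params):
    for w in range(n_qubits):
        qml.Hadamard(wires=w)                      # initialize the hidden state at t = 0
    for x_t in index_sequence:                     # the same (shared) cell per token
        rqenn_cell(x_t, params)
    return [qml.expval(qml.PauliZ(n_qubits - 2)),  # Eq. (11): sigma_z on the last
            qml.expval(qml.PauliZ(n_qubits - 1))]  # two qubits give the two logits

def cross_entropy(logits, label):
    probs = np.exp(logits) / np.sum(np.exp(logits))   # softmax over the two logits
    return -np.log(probs[label])

sequence = [[np.pi / 2, -np.pi / 2, np.pi / 2, -np.pi / 2]] * 3   # dummy encoded tokens
logits = np.array(rqenn_classifier(sequence, np.zeros(n_qubits)))
print(cross_entropy(logits, label=1))
```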

The goal of our vulnerability detection is to detect whether a program's source code may contain vulnerabilities using the RQENN classifier. In this paper, we perform the vulnerability detection task using the pipeline shown in Fig. 4, which consists of the following three steps:

Vulnerability detection task flow. We extract normalized labeled code gadgets from the source code as training data and then generate parameterized binary indexes from them, which are fed into the RQENN classifier. After training, the model can detect the presence of vulnerabilities in the source code.

Step I: Generating normalized code gadgets and labels from source code. First, we extract the data dependency graph (DDG) of the code using the open-source code analysis tool Joern. Next, we extract labeled code gadgets based on manually defined vulnerability features. Specifically, we locate the node containing the vulnerable library function/API call in the extracted DDG, such as the "strcat" function shown on the left side of Fig. 4, and slice the code into small pieces according to the connections to that node. The API calls are categorized into forward (e.g., the "recv" function) and backward API calls (e.g., the "strcat" function here) according to whether or not they take external input from a socket, and forward slices and backward slices are generated accordingly. The forward slices collect the statements of the nodes in the DDG that are recursively reachable forward from the API node, and the backward slices collect the statements of the nodes in the DDG that recursively point to the API node. These slices are code gadgets, which are labeled '0' or '1' depending on whether they contain vulnerabilities or not. In the next step we normalize the code gadgets; the processing includes removing comments and strings and normalizing user-defined variable names ('VAR1', etc.) and function names ('FUN1', etc.). Finally, the normalized labeled code gadgets are obtained.

Step II: The normalized labeled code gadgets are treated as text data from which parameterized binary index sequences are generated. First, we preprocess the data set: we clean the original text and remove punctuation marks, non-ASCII characters, etc. Then we split and pad the preprocessed text and build a dictionary, which is converted into a parameterized binary index dictionary according to the method described above. Finally, the token sequences produced by tokenization are converted into parameterized binary index sequences according to this dictionary.

Step III: Training and evaluating the RQENN model. We input the sequences of parameterized binary indexes into the RQENN model in order, execute the quantum circuit on a simulator or on real hardware, and complete the training and validation of the RQENN model according to the quantum circuit learning framework18. The trained model can then detect the presence of vulnerabilities in source code.
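A possible shape for the Step II preprocessing is sketched below (our sketch; the token pattern, pad token, and sequence length are illustrative choices, not values given in the paper):

```python
import re

PAD = "<pad>"

def gadgets_to_index_sequences(gadgets, max_len=32):
    """Clean, tokenize, pad, and index normalized code gadgets (Step II)."""
    vocab = {PAD: 0}
    tokenized = []
    for gadget in gadgets:
        text = re.sub(r"[^\x00-\x7F]+", " ", gadget)        # drop non-ASCII characters
        tokens = re.findall(r"[A-Za-z_]\w*|\S", text)        # identifiers, numbers, operators
        tokens = (tokens + [PAD] * max_len)[:max_len]        # pad / truncate to a fixed length
        for tok in tokens:
            vocab.setdefault(tok, len(vocab))
        tokenized.append(tokens)
    # Numeric index sequences; these are then turned into parameterized binary
    # indexes exactly as in the encoding sketch shown earlier.
    return [[vocab[tok] for tok in seq] for seq in tokenized], vocab

seqs, vocab = gadgets_to_index_sequences(["VAR1 = NULL", "VAR2 = strcat ( VAR1 , VAR2 )"])
print(seqs[0][:6], len(vocab))
```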

Read the rest here:

Recurrent quantum embedding neural network and its application in vulnerability detection | Scientific Reports - Nature.com

Posted in Quantum Computing | Comments Off on Recurrent quantum embedding neural network and its application in vulnerability detection | Scientific Reports – Nature.com

The 3 Best Quantum Computing Stocks to Buy in June 2024 – InvestorPlace

Posted: at 4:37 pm

Technology firms, both public and private, have been working hard to develop quantum computing technologies for decades. The reasons for that are straightforward. Quantum machines, which harness the quantum mechanics undergirding subatomic particles, have a number of advantages over classical computers. Tasks such as portfolio optimization and climate prediction, whose algorithms improve with added complexity, are better handled by quantum computers.

U.S. equities markets have surged with the rise of generative artificial intelligence (AI) and its potential to create enormous efficiencies and profits for firms across various industries. While AI has brought quantum computing back into the spotlight, a lack of practical ways to scale these complex products has severely dented the performance of pure-play quantum computing stocks, such as IonQ (NYSE:IONQ) and Rigetti Computing (NASDAQ:RGTI).

Fortunately, not every public company invested in quantum computing has seen doom and gloom. Below are the three best quantum computing stocks investors should buy in June.


International Business Machines (NYSE:IBM) is a legacy American technology business. It has its hands in everything from cloud infrastructure, artificial intelligence, and technology consulting services to quantum computers.

The firm committed to developing quantum computing technologies in the early 2000s and tends to publish new findings in the burgeoning field frequently. In December 2023, IBM released a new quantum chip system, Quantum System Two, that leverages the firm's Heron processor, which has 133 qubits. Qubits are analogous to bits on a classical computer. But instead of being confined to states of 0 or 1, qubits, by way of superposition, can assume both states at the same time.

Moreover, what makes Quantum System Two particularly innovative is its use of both quantum and classical computing technologies. In a press release, IBM states, "It combines scalable cryogenic infrastructure and classical runtime servers with modular qubit control electronics." IBM believes the combination of quantum computation and communication with classical computing resources can create a scalable quantum machine.

IBM's innovations in quantum computing technologies as well as AI have not gone unnoticed either. Shares have risen 31.3% over the past 12 months. The computing giant's relatively cheap valuation, coupled with its exposure to novel, high-growth fields, could boost the value of its shares in the long term.


Investors have given Nvidia (NASDAQ:NVDA) attention and praise over the past 12 months due to its critical role in AI computing technologies. The chipmaker's advanced GPUs, including the H100 and H200 processors, are some of the most coveted chips on the market. The new Blackwell chips, coming to market in the second half of 2024, bring even better performance to the table.

Though Nvidia's prowess in the world of AI captures most of the headlines, the firm has already made inroads into the next stage of computing. In 2023, Nvidia announced a new quantum system in conjunction with startup Quantum Machines. It leverages what Nvidia calls the Grace Hopper Superchip (GH200) as well as the chipmaker's advanced CUDA Quantum (CUDA-Q) developer software.

In 2024, Nvidia released its Quantum Cloud platform, which allows users to build and test quantum computing algorithms in the cloud. The chipmaker's GPUs and its open-source CUDA-Q platform will likely be essential to scaling up the quantum computing space.

Nvidia's share price has surged 214.2% over the past 12 months.


Quantum computers are complex machines that require all kinds of components. Furthermore, quantum systems must be kept at extremely low temperatures in order to operate efficiently.

FormFactor (NASDAQ:FORM) specializes in developing cryogenic systems, i.e., systems designed to handle very low temperatures. The company provides everything from wafer-testing probes and low-vibration probe stations to sophisticated refrigerators called cryostats. Also, the firm's analytical probe tools are useful for developing advanced chips, such as NAND flash memory.

With quantum computing systems and advanced memory chips in greater demand these days, FormFactor could see revenues and earnings rise in the near and medium terms. FormFactor's share price has surged 77.5% over the past 12 months, underscoring that investors are taking notice of the company's long-term value.

At the beginning of May, FormFactor released first-quarter results for fiscal year 2024, topping revenue estimates while EPS came in line with market expectations. The firm expects that strong demand for advanced memory chips, such as DRAM, will help propel revenue growth in the following quarters.

On the date of publication, Tyrik Torres did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tyrik Torres has been studying and participating in financial markets since he was in college, and he has a particular passion for helping people understand complex systems. His areas of expertise are semiconductor and enterprise software equities. He has work experience in both investing (public and private markets) and investment banking.

Visit link:

The 3 Best Quantum Computing Stocks to Buy in June 2024 - InvestorPlace

Posted in Quantum Computing | Comments Off on The 3 Best Quantum Computing Stocks to Buy in June 2024 – InvestorPlace

Chicago Trying to Lure PsiQuantum to Former Steel Plant – The Real Deal

Posted: at 4:37 pm

Mayor Brandon Johnson is throwing his weight behind the quantum computing movement that's gaining traction in Chicago.

The Johnson administration is working with county and state officials to create an incentive package for PsiQuantum to redevelop the former U.S. Steel South Works on the city's South Side into a state-of-the-art quantum computing facility, Crain's reported.

The effort is to persuade California-based PsiQuantum, a pioneer in quantum computing, to choose the former U.S. Steel site over the former Texaco refinery in southwest suburban Lockport. Lockport officials, also vying to host the facility, cite the site's access to the significant water and electric power resources essential for the supercomputer's cooling requirements. Plus, the Lockport site is being offered for free and has been environmentally remediated.

Chicago may need to invest up to $150 million to match Lockport's proposal, the outlet reported, citing an anonymous source. Parts of the South Works site are environmentally cleared.

The development, expected to involve billions of dollars in investment, would help revitalize a South Side community that's struggled since the closure of local steel mills. PsiQuantum is expected to make its decision within the next month.

The proposed incentive package further demonstrates Gov. J.B. Pritzker's desire to position Illinois as a hub for quantum development. He views this cutting-edge technology as a long-term driver of high-paying jobs, attracting researchers and skilled workers.

Pritzker envisions the PsiQuantum facility as the cornerstone of a $20 billion quantum research campus. The campus is intended to attract global companies eager to harness the technology's advanced data processing capabilities for artificial intelligence and other advancements.

The Illinois General Assembly has approved up to $500 million in funding for site preparation and incentives. While details of Chicago's incentive package haven't been disclosed, officials said it is highly competitive.

The citys offer could include changes to the zoning code to accommodate the unique requirements of a computer-related development such as a data center, along with potential tax-increment financing districts. In addition, Cook County is considering a Class 8 property tax break, which significantly reduces assessments for industrial developments.

Quinn Donoghue


Read the original:

Chicago Trying to Lure PsiQuantum to Former Steel Plant - The Real Deal

Posted in Quantum Computing | Comments Off on Chicago Trying to Lure PsiQuantum to Former Steel Plant – The Real Deal

Quantum computer photons create a vortex when they collide – Earth.com

Posted: at 4:37 pm

Scientists have stumbled upon a remarkable discovery that challenges our understanding of the quantum world. New research revealed the existence of a previously unknown type of vortex that emerges when photons, the elusive particles of light, engage in a mesmerizing dance of interaction.

The implications of this finding extend far beyond the realm of pure science, holding the potential to revolutionize the field of quantum computing.

The research team, led by a brilliant quartet of scientists, Dr. Lee Drori, Dr. Bankim Chandra Das, Tomer Danino Zohar, and Dr. Gal Winer, embarked on this journey of discovery in the hallowed halls of Prof. Ofer Firstenberg's laboratory at the Weizmann Institute of Science's Physics of Complex Systems Department.

Their initial goal was to explore efficient ways of harnessing the power of photons for data processing in quantum computers.

Little did they know that their quest would lead them down an unexpected path, into a world where the rules of classical physics are bent and the secrets of the quantum realm are laid bare.

Photons, the fundamental particles of light, are known for their wave-like behavior. However, getting them to interact with each other is no easy feat. It requires the presence of matter that acts as an intermediary.

To create the perfect environment for photon interactions, the researchers designed a unique setup: a 10-centimeter glass cell containing a dense cloud of rubidium atoms, tightly packed in the center.

As photons passed through this cloud, the researchers closely examined their state to see if they had influenced one another.

"When the photons pass through the dense gas cloud, they send a number of atoms into electronically excited states known as Rydberg states," Prof. Firstenberg explains.

He goes on to describe how, in these Rydberg states, a single electron within the atom begins to orbit at an astonishing distance, up to 1,000 times the diameter of an unexcited atom.

This electron, with its vastly expanded orbit, generates an electric field so powerful that it envelops and influences countless neighboring atoms, effectively transforming them into what Prof. Firstenberg poetically refers to as an 'imaginary glass ball.'

As the researchers delved deeper into the interactions between photons, they stumbled upon something extraordinary.

When two photons passed relatively close to each other, they moved at a different speed than they would have if each had been traveling alone. This change in speed altered the positions of the peaks and valleys of the waves they carried.

In the ideal scenario for quantum computing applications, the positions of the peaks and valleys would become completely inverted relative to one another, a phenomenon known as a 180-degree phase shift. However, what the researchers observed was even more fascinating.

When the gas cloud was at its densest and the photons were in close proximity, they exerted the highest level of mutual influence.

But as the photons moved away from each other or the atomic density around them decreased, the phase shift weakened and disappeared.

Instead of a gradual process, the researchers were surprised to find that a pair of vortices developed when two photons were a certain distance apart.

To visualize photon vortices, imagine dragging a vertically held plate through water. The rapid movement of the water pushed by the plate meets the slower movement around it, creating two vortices that appear to be moving together along the waters surface.

In reality, these vortices are part of a three-dimensional configuration called a vortex ring.

The researchers discovered that the two vortices observed when measuring two photons are part of a three-dimensional vortex ring generated by the mutual influence of three photons.

These findings showcase the striking similarities between the newly discovered vortices and those found in other environments, such as smoke rings.

While the discovery of photon vortices has taken center stage, the researchers remain dedicated to their original goal of advancing quantum data processing.

The next phase of their study will involve firing photons into each other and measuring the phase shift of each photon separately.

The strength of these phase shifts could determine the potential for photons to be used as qubits, the basic units of information in quantum computing.

Unlike regular computer memory units, which can only be 0 or 1, quantum bits have the ability to represent a range of values between 0 and 1 simultaneously.

"The prevalent assumption was that this weakening would be a gradual process, but researchers were in for a surprise," Dr. Eilon Poem and Dr. Alexander Poddubny, key contributors to the study, reveal.

They go on to describe the astonishing discovery that when two photons reached a specific distance from each other, a pair of vortices spontaneously emerged.

These vortices, characterized by a complete 360-degree phase shift of the photons, featured a peculiar void at their center, eerily reminiscent of the dark, calm eye found at the heart of other well-known vortices in nature.

The journey that led to this discovery spanned eight years and saw two generations of doctoral students pass through Prof. Firstenberg's laboratory.

Over time, the Weizmann scientists successfully created a dense, ultracold gas cloud packed with atoms, enabling them to achieve the unprecedented: photons that underwent a phase shift of 180 degrees or more.

As the research team continues to unravel the mysteries of photon interactions and their potential applications in quantum computing, one thing is certain: their findings have opened up a new realm of possibilities in the world of physics and beyond.

The full study was published in the journal Science.


Continue reading here:

Quantum computer photons create a vortex when they collide - Earth.com

Posted in Quantum Computing | Comments Off on Quantum computer photons create a vortex when they collide – Earth.com

Quantum Computers May Break Bitcoin by 2030, But We Won’t Know About It – Cryptonews

Posted: at 4:37 pm

Last updated: June 13, 2024 09:00 EDT | 11 min read

Quantum computers might sound like another buzzword in the tech world, yet their threat to cryptocurrency is very real and approaching fast. Scientists may differ on the timeline, but they all agree: Q-day is not a matter of if, but when.

Weve spoken to quantum experts around the world to hear the latest estimates on when it will happen, what can be done to protect cryptocurrency, and whether these powerful machines could somehow benefit the crypto world.

Unlike traditional computers, which use bits as the smallest unit of data, each bit being a 1 or a 0, quantum computers use quantum bits, or qubits. These qubits can exist in the 0 and 1 states, or in multiple states at once, a property called superposition.

This allows quantum computers to perform many calculations simultaneously and process large amounts of data much faster than standard computers.

As quantum computers can hold and process many possible outcomes at once, it reduces the time needed to solve problems that depend on trying many different solutions, such as factoring large numbers, which is the foundation of most cryptocurrency encryption.

Factoring large numbers, or integer factorization, is a mathematical process of breaking down a large number into smaller, simpler numbers called factors, which, when multiplied together, result in the original number. The process is called prime factorization if these integers are further restricted to prime numbers.
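As a toy illustration of why factoring is the hard direction (our example with small primes; real keys involve numbers hundreds of digits long):

```python
def prime_factors(n):
    # Trial division: fine for small numbers, hopeless at cryptographic sizes,
    # where the search space grows exponentially with the number of digits.
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

p, q = 10007, 10009               # two small primes
composite = p * q                 # multiplying them is instant...
print(prime_factors(composite))   # ...recovering [10007, 10009] requires search
```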

In cryptocurrency, security heavily relies on the mathematical relationship between private and public keys. A public key is a long string of characters associated with the wallet address. It can be shared openly. A private key, used to sign transactions, must remain confidential. This mathematical relationship is one-way, meaning that a public key can be derived from the private key but not the other way around.

Itan Barmes, the global quantum cyber readiness capability lead at Deloitte, explained in a conversation with Cryptonews:

The quantum computer breaks this one-way relationship between the two. So, if you have someone's public key, you can calculate their private key, impersonate them, transfer their funds elsewhere.

The task is currently nearly impossible for conventional computers. However, in 1994, mathematician Peter Shor showed that a quantum computer could solve the factoring problem much faster. Shor's algorithm can also solve the Discrete Logarithm Problem, which is the basis for the security of most blockchains. This means that if such a powerful quantum computer existed, it could break the cryptocurrency security model.
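A toy modular-exponentiation keypair makes the one-way relationship concrete (our sketch; Bitcoin actually uses elliptic-curve keys on secp256k1, but the asymmetry is the same in spirit):

```python
# Public parameters of the toy scheme.
p = 2**127 - 1                    # a Mersenne prime used as the modulus
g = 3                             # illustrative generator

private_key = 123456789012345678901234567890      # the secret exponent
public_key = pow(g, private_key, p)                # fast to compute forward

# Going backward, recovering private_key from (g, p, public_key), is the
# discrete logarithm problem: infeasible for classical computers at real key
# sizes, but solvable by Shor's algorithm on a large enough quantum computer.
print(hex(public_key))
```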

Not all cryptocurrencies would face the same level of risk from quantum attacks. In 2020, Itan Barmes and a team of Deloitte researchers examined the entire Bitcoin blockchain to determine how many coins were vulnerable. They discovered that about 25% of Bitcoins could be at risk.

Pay To Public Key (P2PK): These addresses directly use the public key, making them visible and vulnerable to quantum attacks.

Pay to Pubkey Hash (P2PKH): These addresses use a cryptographic hash of the public key. They don't expose the public key directly until coins are moved.

Vulnerable coins include those held in P2PK (Pay To Public Key) addresses, which directly expose the public key, making them easy targets for a quantum attack. Coins in reused P2PKH (Pay to Pubkey Hash) addresses are also at risk because these addresses display their public key when the owner moves the funds. This attack is called the storage attack, as it applies to coins residing in static addresses. Itan Barmes further explained:

A quantum attack only applies to specific coins, not everything. If we conducted the same research today, the percentage of vulnerable coins would be lower because the number of vulnerable addresses remains more or less the same, but due to mining, there are more coins in circulation.

Itan Barmes added that in addition to the storage attack, there is also an attack on active transactions, as the public key is exposed for the first time.

Such an attack must be performed within the mining time (for Bitcoin, around 10 minutes), which adds a requirement for the quantum computer to not only be powerful enough but also fast. This so-called transit attack is likely to be possible later than the storage attack due to this additional requirement.

Ideally, Bitcoin users must generate a new address for each transaction. Yet, recent research by Bitmex suggests that about 50% of transaction outputs still go to previously used addresses, which means the practice of address reuse is more common in Bitcoin transactions than we may think.
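The sketch below (ours; the key bytes are a made-up placeholder, not a real key) shows why a fresh P2PKH address hides the public key until it is spent: the address commits only to a hash of the key.

```python
import hashlib

public_key = bytes.fromhex("02" + "11" * 32)        # placeholder compressed public key
sha = hashlib.sha256(public_key).digest()
try:
    # Bitcoin's P2PKH digest is RIPEMD-160 over SHA-256 of the public key;
    # ripemd160 availability depends on the local OpenSSL build.
    pubkey_hash = hashlib.new("ripemd160", sha).hexdigest()
except ValueError:
    pubkey_hash = hashlib.sha256(sha).hexdigest()    # fallback for this sketch only
print(pubkey_hash)   # the address commits to this hash, not to the key itself
```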

Are we nearing the point where quantum computers can pose a real threat? In 2017, a group of researchers, including Divesh Aggarwal and Gavin Brennen, published an article warning that the elliptic curve signature scheme used by Bitcoin could be completely broken by a quantum computer as early as 2027, by the most optimistic estimates.

Cryptonews reached out to the authors to ask whether their estimation has shifted. Gavin Brennen from Macquarie University in Australia replied that although a lot has changed in quantum computing space since then, the basic message is still the same:

Quantum computers pose a threat to blockchains, primarily by attacks on digital signatures, and cryptocurrencies should get started sooner rather than later to upgrade their systems to use post-quantum cryptography before their asset valuations are threatened.

To be able to break cryptocurrency security, quantum computers will likely need thousands, if not millions, of qubits. Currently, the most advanced machines have around 1000.

Another critical challenge is error reduction. Quantum bits are highly sensitive to their environment; even the slightest disturbance, like a change in temperature or vibration, can cause errors in computations, a problem known as quantum decoherence.

Dozens of companies, both public and private, are now actively advancing the development of large quantum computers. IBM has ambitious plans to build a 100,000-qubit chipset and 100 million gates by the end of this decade.

PsiQuantum aims to achieve 1 million photonic qubits within the same timeframe. Quantum gate fidelities and quantum error correction have also significantly advanced. Gavin Brennen continued:

What all this means is that estimates on the size of quantum computers needed to crack the 256-bit elliptic curve digital signatures used in Bitcoin have dropped from 10-20 million qubits to around a million. One article published by the French quantum startup Alice & Bob estimates that it could be cracked with 126,000 physical qubits, though that does assume a highly specialized error model for the quantum computer. In my opinion, a plausible timeline for cracking 256-bit digital signatures is by the mid-2030s.

Gavin Brennen added that substantial technological improvements would be required to reduce all types of gate errors, connect modules, and combine fast classical and quantum control, which is a challenging but surmountable problem.

Yet, if quantum technology becomes powerful enough to break cryptocurrency security, we may not even know about it, believes Marcos Allende, a quantum physicist and CTO of the LACChain Global Alliance. In an email conversation with Cryptonews, Allende wrote:

What is certain is that those who reach that power first will use it silently, making it impossible to guess that selected hackings are happening because of having quantum computers.

Many scientists remain skeptical about the quantum threat to cryptocurrency. Winfried Hensinger, a physicist at the University of Sussex in Brighton, UK, speaking to Nature magazine, described today's quantum computers bluntly: "They're all terrible. They can't do anything useful."

Several challenges keep quantum computing from reaching its full potential. The delicate nature of qubits makes it difficult to maintain them in a quantum state for extended periods. Another challenge is cooling requirements. Many quantum processors must operate at temperatures close to absolute zero, which means they need complicated and costly refrigeration technology. Finally, the quantum systems would need to be integrated with the existing classical ones.

"Just having 200 million qubits not connected to each other is not going to do anything. There are a lot of fundamental physics problems that need to be resolved before we get there. We are still very much at the beginning. But even in the past year, there's been tremendous improvement. The technology can accelerate in a way that all the timelines will be much shorter than we expect," Itan Barmes told Cryptonews.

Tommie van der Bosch, Partner at Deloitte and Blockchain & Digital Asset Leader of Deloitte North and South Europe, believes that the question is not if quantum computing will break cryptocurrency security but when: "The fact that it's a possibility is enough to start taking action. You should have a plan."

Indeed, this year several key crypto companies and the World Economic Forum (WEF) have shared concerns about the implications of quantum computing on cryptocurrency security.

The WEF, in its post published in May, warned that central bank digital currency (CBDC) could become a prime target for quantum attacks. Ripple's recent report has also said that quantum computers could break the digital signatures that currently protect blockchain assets.

Earlier this year, Vitalik Buterin, Ethereum's founder, suggested the Ethereum blockchain would need to undergo a recovery fork to avoid a scenario in which bad actors already have access to powerful quantum computers and are able to use them to steal users' funds.

To protect against these potential quantum attacks, blockchain systems will need to integrate post-quantum cryptographic algorithms. However, incorporating them into existing blockchain protocols is not easy.

New cryptographic methods must first be developed, tested, and standardized. This process can take years and requires the consensus of the cryptographic community to ensure the new methods are secure and efficient.

In 2016, the National Institute of Standards and Technology (NIST) started a project to set new standards for post-quantum cryptography. The project aims to finalize these standards later this year. In 2022, three digital signature methods, CRYSTALS-Dilithium, FALCON, and SPHINCS+, were chosen for standardization.

Once standardized, these new cryptographic algorithms need to be implemented within the blockchain's existing framework. After that, all network participants need to adopt the updated protocol.

Itan Barmes explained, "Let's say someone could tell us exactly the date, three years from now, when we will have these kinds of quantum computers. How quickly do you think we can change the Bitcoin protocol to make it resilient to these attacks? The decentralized governance of Bitcoin can turn out to be a double-edged sword by preventing timely action."

Quantum-resistant algorithms often require more processing power and larger key sizes, which could lead to performance issues on the blockchain. These include slower transaction times and increased computational requirements for mining and verification processes.

Tommie van der Bosch told Cryptonews that, ultimately, the rise of quantum computing could affect the entire economic model of cryptocurrencies.

Coins that upgrade to quantum-resistant protocols in time might gain a competitive advantage. Investors and users could prefer these quantum-safe cryptocurrencies, as they may see them as more secure long-term holdings. This shift could lead to an increase in demand for such cryptocurrencies, potentially enhancing their value and market share compared to those that are slower to adapt. Tommie van der Bosch told Cryptonews:

Let's draw a parallel with the banking system. We've all seen the effects of a bank collapsing or even the rumor of one. Your money suddenly seems at risk. How quickly do people shift their assets? It can trigger a domino effect.

The development of quantum computing could also bring regulatory changes. Regulators could start enforcing stricter standards around trading and custody of cryptocurrencies that havent updated their cryptographic protocols. Such measures would aim to protect investors from sinking funds into potentially vulnerable assets.

Itan Barmes remarked, "Not many people are aware that the cryptographic algorithm used in Bitcoin and essentially all cryptocurrencies is not part of the NIST recommendation (NIST SP800-186). The issue is already present if organizations require compliance with NIST standards. The issue becomes even more complex if algorithms need to be replaced; whose responsibility is it to replace them?"

Could quantum computing actually benefit the cryptocurrency industry? Gavin Brennen suggests it might. In an email exchange with Cryptonews, Brennen discussed the development of quantum-enabled blockchains.

Quantum computers could accelerate mining, although Brennen notes that the improvement over traditional mining rigs would be limited and require quantum computers with hundreds of millions of qubitsfar beyond current capabilities.

New computational problems have been suggested, like the boson sampling problem, that are slow for all types of classical computers but would be fast on a quantum device. Interestingly, the boson sampler is a small, specialized processor using photons of light that is not as powerful as a full quantum computer but much cheaper to build, and it solves a problem immune to ASIC speedups with an energy footprint that is orders of magnitude lower for reaching PoW consensus.

Currently, proof-of-work (PoW) requires vast amounts of electrical power for mining, raising concerns about sustainability and environmental impact. Boson sampling could become a greener alternative, significantly reducing the energy footprint of blockchain operations while maintaining security and efficiency.

Excerpt from:

Quantum Computers May Break Bitcoin by 2030, But We Won't Know About It - Cryptonews

Posted in Quantum Computing | Comments Off on Quantum Computers May Break Bitcoin by 2030, But We Won’t Know About It – Cryptonews

The next administration must be ready for new quantum encryption standards, MITRE advises – Nextgov/FCW

Posted: at 4:37 pm

The next presidential administration, whether it be a second term for current President Joe Biden or a return of former President Donald Trump, will have to focus on ensuring the U.S. is ready for quantum computing to outperform the encryption methods currently used to secure data, a top federally backed research group argues.

MITRE said in an advisory document released last week that the next presidential administration will need to prioritize such quantum computing advances, as well as critical infrastructure protections, clarification of cyber leadership roles and implementation of a zero trust framework for the federal government.

The readout is part of a series of releases from the federally affiliated national security research giant ahead of the upcoming election and possible transition of power in the White House. The release is the first of its kind in the 2024 election season that focuses on U.S. cybersecurity policy.

Today's cryptographic systems rely on complex mathematical algorithms that are difficult for traditional computers to unravel. But future quantum computers could solve these problems much more efficiently because they rely on the laws of quantum mechanics and can process a vast number of possibilities simultaneously. It means malicious actors in the coming years may be augmented with new abilities to decode encrypted information currently considered secure.

Practical quantum computing tools are still in development, though a top NSA official predicted in April that they could be available in three to five years and will likely be accessed in cloud-based environments.

"While it is hard to predict precisely when quantum computing will crack the current encryption, the U.S. government must prepare now to protect data past, present, and future in the context of post-quantum cryptography," said the MITRE advisory, referring to a new era of cryptographic algorithms that are designed to be secure against the capabilities of quantum computers. The National Institute of Standards and Technology has been in the process of developing tools to help agencies migrate to PQC standards, as directed by the White House.

The next administration should assess the U.S. government's post-quantum readiness, craft a cryptographic bill of materials to outline which systems need transitioning to PQC, and use expertise from the PQC Coalition, MITRE argues.

The White House and intelligence partners have already been working to bolster government network defenses against advanced techniques enabled by the creation of practical quantum computers in the near future. The NSA, in particular, has set a 2035 deadline for IC systems to be locked into these new PQC standards.

Federal scientific thought leaders are trying to prevent quantum-powered cyber incidents such as "record now, decrypt later" attacks, in which an adversary hoovers up encrypted data streams, stores them and, once a powerful enough quantum device exists, decrypts that data for theft or exploitation.

See the article here:

The next administration must be ready for new quantum encryption standards, MITRE advises - Nextgov/FCW

Posted in Quantum Computing | Comments Off on The next administration must be ready for new quantum encryption standards, MITRE advises – Nextgov/FCW

Quantum internet breakthrough after ‘quantum data’ transmitted through standard fiber optic cable for 1st time – Livescience.com

Posted: at 4:37 pm

A new quantum computing study claims that a recent finding in the production, storage and retrieval of "quantum data" has brought us one step closer to the quantum internet.

Currently, quantum information is unstable over long distances, and quantum bits, or qubits (the carriers of quantum information), are easily lost or fragmented during transmission.

Classical computer bits are transmitted today as pulses of light through fiber optic cables, with devices called "repeaters" amplifying the signal along the length of the network. To send qubits over comparable distances, we need analogous devices that can store and retransmit quantum states across the whole network, ensuring signal fidelity no matter how far the data has to travel.

These quantum memory devices could receive, store and retransmit qubit states. The new study, conducted at Imperial College London, the University of Southampton, and the Universities of Stuttgart and Würzburg in Germany, claims to have achieved this using standard fiber optic cables for the first time. The findings were published April 12 in the journal Science Advances.

The researchers stored and retrieved photons one of the potential carriers of quantum information using a new and potentially much more efficient method.

"There are two main types of single photon sources,a process called non-linear optical frequency conversion and those based on single emitters," Sarah Thomas, professor of physics at Imperial College, London, told Live Science. "It's been demonstrated many times before that we can store photons from nonlinear optics in a quantum memory because you can engineer the source and memory to match. We used a particular single emitter called a quantum dot, which is a nanocrystal of semiconductors."

Thomas said that using nonlinear optics is less reliable, since a pair of usable photons isn't produced every time, whereas a single-emitter quantum dot produces them at a higher rate.



The next challenge is that the efficiency of the interface between the photon source and the quantum memory depends on matching both wavelength and bandwidth. Discrepancies here make storage and retrieval too inefficient, but the new study bridges that gap.

"We did it by using a high-bandwidth, low-noise quantum memory, fabricating the photon source at a very specific wavelength to match our quantum memory," Thomas said. "We were also able to do it at a wavelength where the loss in optical fiber is the lowest, which will be key in the future for building quantum networks."

But this is not the only recent advance in quantum computing and the quantum internet. In February, Live Science reported on a related breakthrough at Stony Brook University.

Quantum network hardware is typically stable only at extremely low temperatures, which limits real-world applications, but that study achieved a stable connection at room temperature, putting the technology within reach of practical use.

The Imperial study builds on that success thanks to the aligned wavelengths between transmitter and receiver.

"The Stony Brook study used photons at 795 nm [nanometers] and showed interference of two photons after storage and retrieval," Mark Saffman, chief scientist for quantum information at quantum-enabled products company Infleqtion told Live Science. "The Imperial study used a photon at 1529 nm (which is the standard telecom wavelength) and stored and retrieved it, but didn't show interference. The storage and retrieval of telecom wavelength is important for low-loss fiber transmission. Both studies advance different aspects of what's needed for a quantum network."

Michael Hasse, an expert in cybersecurity (one of the areas where quantum networks will have the most impact), told Live Science that the Imperial study describes a method, whereas the earlier study describes a mechanism necessary for that method to work.

"The Imperial work is about a means of establishing long-distance communication using repeaters," he said. "Quantum entanglement allows communications to be far apart in theory, but in reality it's easier when they're closer together. The Stony Brook study refers to the storage of quantum information at room temperature, which is necessary for cost-effective implementation of repeaters."

Continued here:

Quantum internet breakthrough after 'quantum data' transmitted through standard fiber optic cable for 1st time - Livescience.com

Posted in Quantum Computing | Comments Off on Quantum internet breakthrough after ‘quantum data’ transmitted through standard fiber optic cable for 1st time – Livescience.com

European telecoms leading the way in quantum tech adoption, report finds – TNW

Posted: at 4:37 pm

Say "quantum technologies" and most people probably still imagine something decades in the future. But, as a new report released today demonstrates, quantum is already here, especially as it relates to the telecom industry.

After years of incremental progress confined to research institutions, the emerging quantum technology sector has begun to gather commercial momentum. While most of the developments have been related to the quantum computing domain and its future promises, there are many other use cases for quantum tech applicable already today.

Quantum communications, including networks and forms of encryption, are currently being commercialised by a growing number of major telecom industry players and startups throughout the world. And Europe has a major part to play.

According to a report released today by Infinity, a startup and ecosystem support branch of Quantum Delta NL, 32% of the 100 quantum startups, scaleups, and SMEs servicing the telecom and telecom infrastructure sector are based in continental Europe. Germany, the Netherlands, France, Switzerland, and Spain are the strongest ecosystems. An additional 14% are in the UK and Ireland.

In addition, 50% of the enterprises that serve as consumers of the technology are located in continental Europe, with a further 11% in the UK and Ireland. Indeed, there are already more than 25 quantum networks being deployed in Europe today.

This includes a commercial quantum network in London, launched through a partnership between BT and Toshiba Europe, and an EU-wide quantum communications network being developed by Deutsche Telekom and two consortia called Petrus and Nostradamus.

"Telecom companies are becoming a driving force for real-world adoption of quantum technology," said Teun van der Veen, Quantum Lead at the Netherlands Organisation for Applied Scientific Research (TNO). "They are at the forefront of integrating quantum into existing infrastructures, and for them it is all about addressing end-user needs."

Quantum networks utilise the unique properties of quantum mechanics, such as superposition and entanglement, to connect systems and transmit data securely. This is done through quantum channels, which can be implemented using optical fibres, free-space optics, or satellite links.

The promise of quantum networks and quantum encryption is that they would be near-impossible, if not entirely impossible, to hack, thus offering ultra-secure forms of communication.

As Infinity's report states, they can be used to establish quantum-secure links between data centres, Earth and spacecraft and satellites, military and governments, trains and rail network control centres, hospital and health care sites, etc.

Quantum networks can also form the backbone of a global quantum internet, connecting quantum computers in different locations. Furthermore, they can offer opportunities for blind cloud quantum computing, which keeps quantum operations secret from everyone but the user.

With geopolitical tensions on the rise and looming cybersecurity threats, companies and governments are increasingly looking into ways of securing IT infrastructure and data.

Perhaps unsurprisingly then, Infinity's report finds that Quantum Key Distribution (QKD) is the most popular use of quantum technology in the telecom sector. QKD utilises quantum mechanics to allow parties to generate a key that is known only to them, and is used to encrypt and decrypt messages.
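
The best-known QKD protocol is BB84: the sender encodes random bits in randomly chosen photon polarisation bases, the receiver measures in his own random bases, and the two keep only the positions where the bases matched; an eavesdropper measuring in the wrong basis unavoidably introduces detectable errors. The stripped-down Python sketch below simulates only the basis-sifting step; it is an illustration under simplifying assumptions, with no real optics or eavesdropper model.

    import random

    def bb84_sift(n_photons=32, seed=1):
        """Simulate BB84 basis sifting: bits survive only where sender and
        receiver happened to pick the same basis (about half on average)."""
        rng = random.Random(seed)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_photons)]
        alice_bases = [rng.choice("+x") for _ in range(n_photons)]  # + rectilinear, x diagonal
        bob_bases   = [rng.choice("+x") for _ in range(n_photons)]
        # With matching bases Bob reads Alice's bit exactly; otherwise the
        # outcome is random and the position is discarded during sifting.
        sifted = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
        return sifted

    key = bb84_sift()
    print(len(key), key)  # roughly half of the 32 positions survive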

One startup that knows a lot about QKD technology is Q*Bird. The Delft-based communications security company just raised €2.5mn to further develop its QKD product Falqon, already in trial with the Port of Rotterdam (the largest port in Europe).

"Quantum communications solutions see increased interest across digital infrastructure in the EU," said Ingrid Romijn, co-founder and CEO of Q*Bird. "Together with partners like Cisco, Eurofiber, Intermax, Single Quantum, Portbase and InnovationQuarter, Q*Bird is already testing quantum secure communications in the Port of Rotterdam using our novel quantum cryptography (QKD) technology."

Romijn further stated that, moving forward, more industries and companies will be able to implement scalable solutions protecting data communications, leveraging next-generation QKD technology.

Another technology garnering interest is post-quantum cryptography (PQC). Q-day (the day when a quantum computer breaks the internet) is, in all probability, still some way into the future.

However, most classical cryptography methods will be vulnerable to hacking from a sufficiently powerful quantum computer sooner. PQC algorithms are designed to be secure against both classical and quantum attacks.

Other technologies with potential applications for the telecom industry are quantum sensors, clocks, simulation, random number generation, and, naturally, quantum computing.

Meanwhile, despite the increasing market interest, the report also finds that Europe's quantum technology startups require more support and investment to achieve the technical and market breakthroughs needed to drive the field forward.

Currently, only 42% of the quantum tech for telecom startups worldwide have external funding, having raised a total of €1.9bn between them. And despite the relatively forward-thinking approach of the EU, as demonstrated by the Deutsche Telekom network project, the US still leads in terms of private-sector activity and investment.

Other challenges include raising awareness among business leaders, growing the skilled workforce, overcoming technical limitations, and building a stronger business narrative.

These can be surmounted partially through more regulatory standardisation, more collaboration with industry, and more early-stage support and investment for startups, the report says.

The key market opportunities for the quantum communications sector going forward are in government bodies including military and security services, financial institutions, and critical infrastructure departments, as well as companies in the energy, defence, space, and technology sectors.

"Growing collaboration between enterprises and startups in telecom signals the industry's commitment to integrating quantum solutions into commercial applications," said Pavel Kalinin, Operations and Platforms Lead at Infinity. "Successful implementation of such technologies will depend on coordinated efforts to prepare the workforce, facilitate collaborations, and set industry benchmarks and standards."

You can read the report in its entirety here.

See the original post:

European telecoms leading the way in quantum tech adoption, report finds - TNW

Posted in Quantum Computing | Comments Off on European telecoms leading the way in quantum tech adoption, report finds – TNW

Simulating Quantum Circuits with Light-Induced Magnetism – AZoQuantum

Posted: at 4:37 pm

Jun 12, 2024. Reviewed by Lexie Corner

Researchers from the Graz University of Technology have calculated how suitable molecules can be excited by pulses of infrared light to generate magnetic fields. The research, published in the Journal of the American Chemical Society, will help in the construction of quantum computing circuits.

Molecules exposed to infrared radiation start to vibrate because of the energy they absorb. This well-known phenomenon prompted Andreas Hauser of the Institute of Experimental Physics at Graz University of Technology (TU Graz) to investigate whether these oscillations could also be exploited to produce magnetic fields.

The idea rests on the fact that atomic nuclei are positively charged, and moving charges create magnetic fields. Andreas Hauser and colleagues have now determined that when infrared pulses act on metal phthalocyanines (ring-shaped, planar dye molecules), their high symmetry lets them generate minuscule, nanometre-scale magnetic fields.

The calculations suggest that nuclear magnetic resonance spectroscopy could be used to measure the relatively weak but very precisely localized field strength.

The team used modern electronic structure theory on supercomputers at the Vienna Scientific Cluster and at TU Graz to calculate how phthalocyanine molecules behave when irradiated with circularly polarized infrared light.

They also drew on preliminary work from the early days of laser spectroscopy, some of it decades old. The circularly polarized, or helically twisted, light waves excite two simultaneous molecular vibrations at right angles to one another.

"As every rumba dancing couple knows, the right combination of forwards-backwards and left-right creates a small, closed loop. And this circular movement of each affected atomic nucleus actually creates a magnetic field, but only very locally, with dimensions in the range of a few nanometers."

Andreas Hauser, Institute of Experimental Physics, Graz University of Technology
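
The rumba picture corresponds to a simple classical estimate; the sketch below is a back-of-the-envelope illustration under textbook assumptions, not the paper's quantum-chemical treatment. Two orthogonal vibrations of equal amplitude, ninety degrees out of phase, move each nucleus on a small circle,
\[
x(t) = A\cos(\omega t), \qquad y(t) = A\sin(\omega t),
\]
so a nucleus of charge $q$ circulates with period $T = 2\pi/\omega$, equivalent to a current loop with current $I = q\omega/2\pi$ enclosing area $\pi A^2$ and magnetic moment
\[
\mu = I\,\pi A^2 = \tfrac{1}{2}\, q\, \omega A^2 ,
\]
with the resulting field confined to roughly the nanometre scale of the loop itself.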

According to Andreas Hauser, it is even possible to regulate the magnetic field's strength and direction by carefully adjusting the infrared light. As a result, the molecules would become high-precision optical switches that could be utilized to construct quantum computer circuits.

Andreas Hauser is working with colleagues at the TU Graz Institute of Solid-State Physics and a group at the University of Graz to demonstrate that controlled generation of molecular magnetic fields is possible.

"For proof, but also for future applications, the phthalocyanine molecule needs to be placed on a surface. However, this changes the physical conditions, which in turn influences the light-induced excitation and the characteristics of the magnetic field. We therefore want to find a support material that has minimal impact on the desired mechanism."

Andreas Hauser, Institute of Experimental Physics, Graz University of Technology

In a subsequent stage, before testing the most promising candidates experimentally, the physicist and his associates plan to compute the interactions between the deposited phthalocyanines, the support material and the infrared light.

Wilhelmer, R., et al. (2024) Molecular Pseudorotation in Phthalocyanines as a Tool for Magnetic Field Control at the Nanoscale. Journal of the American Chemical Society. doi.org/10.1021/jacs.4c01915

Read the original post:

Simulating Quantum Circuits with Light-Induced Magnetism - AZoQuantum

Posted in Quantum Computing | Comments Off on Simulating Quantum Circuits with Light-Induced Magnetism – AZoQuantum
