Quantum Computing can revolutionize our ability to simulate the natural world
Yet many QC experts have given up and moved to other industries, believing a useful QC platform won't arrive before 2040.
Can QC be saved this decade? Yes.
Here’s my contrarian QC thread 🧵
Quantum mechanics dominates the world of the very small, but determines properties we measure macroscopically. Nowhere is this more important than materials science.
Yet simulating crystal formation is a profoundly difficult task for classical computing. Why is it so difficult?
To accurately predict material properties, we must account for how crystal structure depends on the electronic orbitals of individual atoms.
Predicting orbital interactions means solving the many-body Schrödinger equation, a task that scales exponentially and quickly becomes intractable for classical computing
The advantage of QC is that computing hardware naturally embodies the quantum dynamics governing electron orbital interactions. There is a direct mapping between the simulation to be solved and the simulator itself.
A quantum computer can explore many possible solutions concurrently.
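A toy sketch of that concurrency (Qiskit is my choice here, not something the thread specifies): n Hadamard gates put a register into an equal superposition over all 2^n basis states at once.

```python
# Toy illustration: 3 Hadamard gates create an equal superposition
# over all 2^3 = 8 basis states -- the register "holds" every
# candidate solution at once. Requires qiskit (pip install qiskit).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

n = 3
qc = QuantumCircuit(n)
qc.h(range(n))  # Hadamard on every qubit

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # all 8 bitstrings at probability 0.125
```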
The issue with QC right now is that we haven't learned to fully control the noise that creeps into the system. Classical computing averages thermal fluctuations away by using computing elements much larger than those fluctuations.
QC elements are necessarily small and so necessarily sensitive
This is where the pessimism about QC comes in: building a fault-tolerant, error-corrected QC will take millions of qubits. Google recently wowed us with its Willow chip, which has 105 qubits.
Going from ~50 to ~100 qubits took 5 years. We need a million.
The reality is that progress in the number of usable physical qubits is painstakingly slow, forever pushing further into the future the point where quantum computers achieve supremacy over classical ones for certain problems.
Many have left QC for other computing fields or quantum sensing
The question is: can quantum supremacy be reached faster than hardware development can scale up the number of usable qubits?
The answer is yes, if we understand how noise affects quantum gate operations
Single-qubit gates like X, Y, and Z rotations are the least noisy because only one qubit is being manipulated. Error rates are ~0.1%.
Two-qubit gates like CNOT or SWAP have error rates 5-10x higher.
Three-qubit gates are so noisy they're decomposed into sets of smaller gates.
There is a vast number of ways to implement a given functional circuit with a specific set of quantum gates, each trading off required circuit depth against gate properties.
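To make that concrete, here's a small sketch (Qiskit assumed; the basis gate set is just illustrative): one logical Toffoli gate, decomposed into 1- and 2-qubit gates at different optimization levels, each yielding a different depth and gate count.

```python
# Sketch: one logical circuit, several physical implementations.
# A Toffoli (3-qubit) gate is decomposed into 1- and 2-qubit gates,
# and different optimization levels trade compile effort for depth.
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(3)
qc.ccx(0, 1, 2)  # Toffoli: rarely run natively on hardware

for level in (0, 1, 2, 3):
    compiled = transpile(
        qc,
        basis_gates=["cx", "rz", "sx", "x"],  # illustrative IBM-style basis
        optimization_level=level,
    )
    print(level, "depth:", compiled.depth(), "ops:", dict(compiled.count_ops()))
```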
These trade-offs also depend on which QC hardware modality you're running on, and there are several
Superconducting, neutral atom, and trapped ion are the most popular approaches. Neutral atom and trapped ion devices are more stable, stay coherent for longer, and their qubits have better connectivity, making them suitable for simulation problems with longer-range interactions.
In contrast, superconducting qubits have more cross-talk between elements but operate faster and are easier to scale up, making them better suited to the high-volume workloads in quantum machine learning: shallow algorithms executed very quickly.
There are more approaches than just those, however, each with its own set of trade-offs that can make it competitive for a specific class of application or problem.
Balancing hardware trade-offs, gate construction, error mitigation and executing algorithms to minimize noise is extremely non-trivial.
But the win is huge: accelerating usable QC from decades to just years away
Turns out there's a company doing exactly this: @Haiqu_inc
By optimizing the construction and connectivity of specific quantum gates, and executing them in an approximate way that is inherently resilient to noise, @Haiqu_inc can increase the usable depth - the longest sequence of gate operations - by up to 100x
Haiqu does for Quantum Processing Units (QPUs) what CUDA programming did for GPUs - creates a standard interface that is hardware agnostic, so engineers can use the latest hardware without worrying about implementation and execution details.
And then there’s the noise…
The resulting performance improvement is profound:
Increasing the depth of an executable circuit means we don’t have to wait decades to use QC to simulate the natural world.
Now, there are two main types of simulation:
End-to-End circuits involve many sequential gate operations that don't change as the simulation runs.
Since noise accumulates across gate operations, these are especially susceptible to decoherence: the point where noise swamps the solution and the results become useless.
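A back-of-envelope sketch of why depth kills you (assuming independent gate errors, which is a simplification, and using the rough error rates quoted above):

```python
# Back-of-envelope: with independent gate errors, circuit fidelity
# decays roughly as the product of per-gate fidelities. Error rates
# below are the rough figures quoted earlier, not measured values.
p1, p2 = 1e-3, 7e-3  # assumed: ~0.1% single-qubit, ~0.7% two-qubit error

def est_fidelity(n_1q: int, n_2q: int) -> float:
    # independent-error approximation: fidelities multiply
    return (1 - p1) ** n_1q * (1 - p2) ** n_2q

for layers in (10, 100, 1000):
    print(f"{layers:>5} layers: ~{est_fidelity(layers, layers):.4f}")
# 10 layers: ~0.92, 100 layers: ~0.45, 1000 layers: ~0.0003 -- gone
```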
Optimizing the execution means limiting the noise that creeps into the computation, via methods like redesigning the circuit, replacing or combining gates, or performing measurements in a noise-resilient subspace
This last method is known as constructing a decoherence-free subspace: choosing a set of basis states that transform normally under quantum operations yet are unaffected by the noise that accumulates through them, increasing stability
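Here's the textbook example of a decoherence-free subspace (a generic illustration, not Haiqu's specific construction), in plain numpy:

```python
# Textbook decoherence-free subspace (generic, not Haiqu's method):
# encode |0_L> = |01>, |1_L> = |10>. Collective dephasing acts as
# exp(-i*theta*(Z(x)I + I(x)Z)), and both logical states sit in the
# zero eigenspace of Z(x)I + I(x)Z, so the noise leaves them alone.
import numpy as np
from scipy.linalg import expm

Z = np.diag([1.0, -1.0])
I = np.eye(2)
collective_Z = np.kron(Z, I) + np.kron(I, Z)

zero_L = np.array([0, 1, 0, 0], dtype=complex)  # |01>
one_L  = np.array([0, 0, 1, 0], dtype=complex)  # |10>
psi = (zero_L + one_L) / np.sqrt(2)             # logical superposition

noise = expm(-1j * 0.7 * collective_Z)          # arbitrary dephasing angle
print(np.allclose(noise @ psi, psi))            # True: state untouched
```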
Increasing the depth of an End-to-End circuit brings QC methods into the realm of usability for simulations like Computational Fluid Dynamics and Spectroscopy by modeling non-linear many-body interactions as they evolve through time, mapping these to quantum dynamics
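A minimal sketch of what such an end-to-end circuit looks like (Qiskit assumed; the Hamiltonian is a toy Ising model, not a real material): Trotterizing exp(-iHt), where finer time steps mean deeper circuits, which is exactly why depth is the bottleneck.

```python
# Minimal end-to-end time-evolution sketch: Trotterize exp(-iHt) for
# a toy 3-qubit transverse-field Ising Hamiltonian. More Trotter reps
# = finer time resolution = deeper circuit.
from qiskit import QuantumCircuit
from qiskit.circuit.library import PauliEvolutionGate
from qiskit.quantum_info import SparsePauliOp
from qiskit.synthesis import SuzukiTrotter

H = SparsePauliOp.from_list([("ZZI", 1.0), ("IZZ", 1.0),
                             ("XII", 0.5), ("IXI", 0.5), ("IIX", 0.5)])
evo = PauliEvolutionGate(H, time=1.0, synthesis=SuzukiTrotter(reps=4))

qc = QuantumCircuit(3)
qc.append(evo, range(3))
print("circuit depth:", qc.decompose(reps=2).depth())
```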
The other class of methods that @Haiqu_inc can greatly improve with their software layer is Variational Quantum Computing (VQC), integral to Quantum Machine Learning (QML) and where the real magic begins in unlocking materials science:
VQC uses a hybrid of quantum and classical computing to evolve the parameters governing quantum gates as the simulation runs. Just like classical ML, QML optimizes a solution under a loss function; for QC, that loss is typically the energy of the system
Quantum systems naturally want to settle into their lowest-energy state
This makes QML the ideal method for simulating quantum chemistry experiments, like finding the lowest-energy configuration of a crystal lattice, and therefore for designing revolutionary new materials from first principles.
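A minimal variational loop (pure numpy/scipy here as a sketch; a real VQC run would evaluate the energy on a QPU): one rotation parameter, a toy 1-qubit Hamiltonian, and a classical optimizer minimizing the energy.

```python
# Minimal variational loop, pure numpy/scipy (a sketch -- a real VQC
# run would evaluate the energy on a QPU, not a simulator).
# Ansatz: |psi(theta)> = Ry(theta)|0>; loss: E(theta) = <psi|H|psi>.
import numpy as np
from scipy.optimize import minimize_scalar

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
H = Z + 0.5 * X                      # toy 1-qubit "molecule"

def energy(theta: float) -> float:
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])  # Ry(theta)|0>
    return float(psi @ H @ psi)

res = minimize_scalar(energy, bounds=(0, 2 * np.pi), method="bounded")
print("variational min:", res.fun, "exact:", min(np.linalg.eigvalsh(H)))
```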
What are the issues?
VQC methods don't require the depth of End-to-End circuits, but they do need to execute many times on shallow circuits, meaning noise still creeps in and degrades the fidelity of the simulation over time.
Noise produces 'barren plateaus,' regions where the gradient appears to vanish and the solution can no longer converge.
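A stylized arithmetic sketch of the problem (the ~2^-n gradient scaling is the generic theoretical result for deep random ansätze, not a measurement):

```python
# Arithmetic sketch of a barren plateau (stylized scaling, not data):
# typical gradient magnitude in a generic deep ansatz shrinks like
# ~2**-n qubits, while shot noise shrinks only like shots**-0.5.
# Once noise > gradient, the optimizer sees a flat landscape.
shots = 10_000
noise_floor = shots ** -0.5                      # ~0.01

for n in (4, 8, 12, 16, 20):
    grad = 2.0 ** -n                             # stylized gradient scale
    print(f"{n:>2} qubits: grad ~{grad:.1e}, trainable: {grad > noise_floor}")
```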
Haiqu solves this via a combination of especially lightweight error mitigation and noise-resilient trainability techniques that make QML possible at scale. This shrinks the 'barren plateaus' (vanishing gradients) that plague VQC methods, letting them reach convergence faster
An issue common to all QC applications is data loading, since the representation of the data on the circuit is susceptible to noise. Haiqu recently demonstrated an efficient and scalable data loading method with HSBC, loading the largest financial dataset to date onto IBM QPUs
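For context, the naive baseline (not Haiqu's method, which they haven't published in full) is amplitude encoding, whose circuits blow up with data size; a Qiskit sketch:

```python
# Naive data-loading baseline (not Haiqu's method): amplitude-encode
# a classical vector into qubit amplitudes. Exact state preparation
# needs circuits that grow exponentially with qubit count -- the
# reason efficient loading schemes matter.
import numpy as np
from qiskit import QuantumCircuit, transpile

data = np.array([0.2, 0.5, 0.1, 0.7, 0.3, 0.4, 0.6, 0.2])
amps = data / np.linalg.norm(data)   # must be a unit vector

qc = QuantumCircuit(3)
qc.prepare_state(amps, range(3))     # 2^3 values -> 3 qubits

compiled = transpile(qc, basis_gates=["cx", "rz", "sx", "x"])
print("depth to load 8 numbers:", compiled.depth())
```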
Taken together, Haiqu's techniques for enhancing execution on the QPU bring applications into the realm of usability on current noisy intermediate-scale quantum (NISQ) devices, a decade or more ahead of fully error-corrected, fault-tolerant QC.
What’s the net-net for users?
The goal with Haiqu is to take care of all the complexities and trade-offs inherent in translating a simulation goal into an implementable quantum circuit and executing it, continuously incorporating algorithm and hardware improvements and bringing them to end users
This goes far beyond just noise-mitigation and smart execution strategies: it’s the entire stack between “I want to simulate something” and having that simulation run in a performance-optimized manner on available QC hardware.