Table of Contents
- Introduction
- Classical Boltzmann Machines Recap
- From Classical to Quantum Boltzmann Machines
- Structure of a Quantum Boltzmann Machine (QBM)
- Quantum Energy-Based Models
- Hamiltonian Representation in QBM
- Quantum States as Probability Distributions
- QBM vs Classical RBM and DBM
- Restricted Quantum Boltzmann Machines (RQBM)
- Training Quantum Boltzmann Machines
- Parameter Optimization with Contrastive Divergence
- Quantum Sampling and Measurement
- Gradient Estimation in QBMs
- QBM Implementation with D-Wave and Annealers
- QBM Implementation in Variational Quantum Circuits
- Applications of QBMs
- Challenges in Realizing QBMs
- Hybrid Quantum-Classical Strategies for QBM Training
- Software Frameworks Supporting QBM Models
- Conclusion
1. Introduction
Quantum Boltzmann Machines (QBMs) are quantum analogs of classical Boltzmann Machines, capable of representing and learning probability distributions using quantum mechanical systems. They are designed to harness quantum superposition and entanglement for generative modeling and statistical learning.
2. Classical Boltzmann Machines Recap
- Energy-based probabilistic models
- Represent distributions using energy functions:
\[
P(v) = \frac{1}{Z} \sum_h e^{-E(v, h)}
\]
- Often trained using contrastive divergence
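To make the marginal concrete, here is a minimal NumPy sketch, assuming the standard RBM energy form \( E(v, h) = -a^\top v - b^\top h - v^\top W h \) (the weights `W` and biases `a`, `b` below are arbitrary illustrative placeholders):

```python
import numpy as np

def energy(v, h, W, a, b):
    """Standard RBM energy: E(v, h) = -a.v - b.h - v.W.h for binary units."""
    return -(a @ v) - (b @ h) - (v @ W @ h)

# Toy model: 3 visible and 2 hidden binary units with random parameters
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))
a, b = rng.normal(size=3), rng.normal(size=2)

# Unnormalized P(v): sum e^{-E(v,h)} over all hidden configurations
v = np.array([1, 0, 1])
hidden_states = [np.array([i, j]) for i in (0, 1) for j in (0, 1)]
p_unnorm = sum(np.exp(-energy(v, h, W, a, b)) for h in hidden_states)
print(p_unnorm)  # divide by Z (sum over all v and h) to obtain P(v)
```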
3. From Classical to Quantum Boltzmann Machines
- Replace classical energy function with a quantum Hamiltonian
- Use density matrices to represent probabilistic quantum states
- Training involves manipulating Hamiltonian parameters
4. Structure of a Quantum Boltzmann Machine (QBM)
- Nodes: visible and hidden qubits
- Energy: represented by quantum Hamiltonian \( H \)
- Quantum state: \( \rho = e^{-\beta H} / Z \)
5. Quantum Energy-Based Models
- Define a Gibbs state:
\[
\rho = \frac{e^{-\beta H}}{Z}, \quad Z = \text{Tr}(e^{-\beta H})
\]
- The Hamiltonian encodes interactions between qubits
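A minimal sketch of computing this Gibbs state exactly for a toy two-qubit Hamiltonian via SciPy's matrix exponential; the Hamiltonian terms and \( \beta \) are illustrative choices, not prescribed values:

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)

# Toy two-qubit Hamiltonian with illustrative coefficients
H = np.kron(Z, I) + np.kron(I, Z) + 0.5 * np.kron(X, X)

beta = 1.0
rho_unnorm = expm(-beta * H)
Z_part = np.trace(rho_unnorm).real   # partition function Z = Tr(e^{-beta H})
rho = rho_unnorm / Z_part            # normalized Gibbs state

# The diagonal of rho gives computational-basis measurement probabilities
print(np.real(np.diag(rho)))
```

This brute-force construction scales exponentially in qubit count, which is exactly why quantum hardware is interesting for sampling from such states.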
6. Hamiltonian Representation in QBM
Common form:
\[
H = \sum_i a_i Z_i + \sum_{i<j} b_{ij} Z_i Z_j + \sum_{i<j} c_{ij} X_i X_j
\]
- Combines longitudinal (Z) fields and couplings, which mirror a classical Ising energy, with transverse (X-type) terms that make the model genuinely quantum
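A sketch of assembling this operator symbolically with PennyLane's `qml.Hamiltonian`; the field strengths and couplings below are arbitrary illustrative values:

```python
import pennylane as qml

n = 3
a = [0.5, -0.3, 0.1]              # illustrative longitudinal fields
b = {(0, 1): 0.7, (1, 2): -0.2}   # illustrative Z-Z couplings
c = {(0, 1): 0.4}                 # illustrative X-X couplings

coeffs = a + list(b.values()) + list(c.values())
ops = (
    [qml.PauliZ(i) for i in range(n)]
    + [qml.PauliZ(i) @ qml.PauliZ(j) for (i, j) in b]
    + [qml.PauliX(i) @ qml.PauliX(j) for (i, j) in c]
)

H = qml.Hamiltonian(coeffs, ops)
print(H)
```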
7. Quantum States as Probability Distributions
- Measurement collapses a superposition into a definite computational-basis outcome
- The Gibbs distribution is approximated through repeated state preparation and sampling
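A self-contained sketch of this idea for a single qubit, assuming the illustrative choices \( H = Z \) and \( \beta = 1 \): repeated Born-rule sampling recovers the Gibbs distribution encoded in the state's diagonal:

```python
import numpy as np
from scipy.linalg import expm

# Single-qubit Gibbs state for H = Z at beta = 1 (illustrative)
Zp = np.array([[1, 0], [0, -1]], dtype=complex)
rho = expm(-1.0 * Zp)
rho /= np.trace(rho).real

probs = np.real(np.diag(rho))    # Born-rule probabilities in the Z basis
rng = np.random.default_rng(1)
shots = rng.choice(2, size=5000, p=probs)

# The empirical histogram converges to the Gibbs distribution as shots grow
counts = np.bincount(shots, minlength=2) / len(shots)
print("empirical:", counts, "exact:", probs)
```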
8. QBM vs Classical RBM and DBM
| Model | Structure | Training Strategy | Representation |
|---|---|---|---|
| RBM | Bipartite | Contrastive divergence | Classical |
| DBM | Deep layers | Greedy layer-wise | Classical |
| QBM | Visible and hidden qubits | Quantum annealing / variational | Quantum density matrices |
9. Restricted Quantum Boltzmann Machines (RQBM)
- Constrain connections to visible-hidden only
- Easier to train and simulate
- Analogous to RBMs
10. Training Quantum Boltzmann Machines
- Use cost functions like KL divergence
- Maximize likelihood or minimize free energy
- Variational techniques to optimize parameters
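A sketch of the KL-divergence objective over computational-basis distributions; the two distributions below are illustrative stand-ins for an empirical data distribution and the measured diagonal of a trial Gibbs state:

```python
import numpy as np

def kl_divergence(p_data, p_model, eps=1e-12):
    """KL(p_data || p_model): the quantity training drives toward zero."""
    p_model = np.clip(p_model, eps, None)
    mask = p_data > 0
    return float(np.sum(p_data[mask] * np.log(p_data[mask] / p_model[mask])))

p_data = np.array([0.5, 0.0, 0.0, 0.5])    # e.g. perfectly correlated bit pairs
p_model = np.array([0.4, 0.1, 0.1, 0.4])   # model's measured distribution
print(kl_divergence(p_data, p_model))
```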
11. Parameter Optimization with Contrastive Divergence
- Approximate the gradient by the difference between data-driven and model-driven sample statistics
- Quantum version uses variational circuits to compute loss and gradients
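For reference, a sketch of one classical CD-1 update for a binary RBM; in a QBM the model-driven (negative-phase) statistics would come from quantum sampling rather than the single Gibbs sweep used here:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v_data, W, a, b, rng):
    """One CD-1 gradient estimate: data-driven minus model-driven statistics."""
    # Positive phase: hidden activations conditioned on the data
    ph_data = sigmoid(b + v_data @ W)
    h_data = (rng.random(ph_data.shape) < ph_data).astype(float)
    # Negative phase: one Gibbs sweep to a model-driven reconstruction
    pv_model = sigmoid(a + h_data @ W.T)
    v_model = (rng.random(pv_model.shape) < pv_model).astype(float)
    ph_model = sigmoid(b + v_model @ W)
    # Gradient estimate: <v h>_data - <v h>_model
    dW = np.outer(v_data, ph_data) - np.outer(v_model, ph_model)
    return dW, v_data - v_model, ph_data - ph_model

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 2))
a, b = np.zeros(3), np.zeros(2)
dW, da, db = cd1_step(np.array([1.0, 0.0, 1.0]), W, a, b, rng)
print(dW)
```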
12. Quantum Sampling and Measurement
- Use quantum annealers (D-Wave) or variational sampling
- Multiple measurements used to construct probability distributions
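A sketch of variational sampling on PennyLane's simulator: repeated shots of a small entangling circuit are tallied into an empirical distribution (the circuit itself is illustrative, not a trained Gibbs-state preparation):

```python
import numpy as np
import pennylane as qml

dev = qml.device("default.qubit", wires=2, shots=2000)

@qml.qnode(dev)
def circuit():
    # Illustrative Bell-state preparation; a QBM would use a trained circuit
    qml.Hadamard(wires=0)
    qml.CNOT(wires=[0, 1])
    return qml.sample(wires=[0, 1])

samples = circuit()                                   # shape (shots, 2)
indices = (samples[:, 0] * 2 + samples[:, 1]).astype(int)
probs = np.bincount(indices, minlength=4) / len(indices)
print(probs)   # approximately [0.5, 0, 0, 0.5] for the Bell state
```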
13. Gradient Estimation in QBMs
- Use parameter-shift rule for gradients:
\[
\frac{\partial \langle H \rangle}{\partial \theta} = \frac{\langle H(\theta + \pi/2) \rangle - \langle H(\theta - \pi/2) \rangle}{2}
\]
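A sketch of this rule in PennyLane for a one-parameter circuit, using \( \langle Z \rangle \) as an illustrative stand-in for \( \langle H \rangle \); the two shifted evaluations reproduce the analytic gradient \( -\sin\theta \) exactly:

```python
import numpy as np
import pennylane as qml

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev)
def expval_H(theta):
    qml.RX(theta, wires=0)
    return qml.expval(qml.PauliZ(0))   # stand-in for <H>; here <Z> = cos(theta)

def parameter_shift(theta):
    """Gradient from two shifted circuit evaluations."""
    return (expval_H(theta + np.pi / 2) - expval_H(theta - np.pi / 2)) / 2

theta = 0.3
print(parameter_shift(theta), -np.sin(theta))   # the two values agree
```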
14. QBM Implementation with D-Wave and Annealers
- D-Wave supports native Ising Hamiltonians
- Offers limited control over entanglement and no gate-level access
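A sketch of sampling the Ising (Z-diagonal) part of a QBM Hamiltonian with the Ocean SDK; simulated annealing via `neal` stands in for hardware, where `DWaveSampler` wrapped in `EmbeddingComposite` (from `dwave.system`) would target the QPU. The biases and couplings are illustrative:

```python
import dimod
import neal

h = {0: 0.5, 1: -0.3}    # illustrative linear (field) biases
J = {(0, 1): 0.7}        # illustrative pairwise coupling

bqm = dimod.BinaryQuadraticModel.from_ising(h, J)

# Simulated annealing as a local stand-in for the quantum annealer
sampler = neal.SimulatedAnnealingSampler()
sampleset = sampler.sample(bqm, num_reads=100)
print(sampleset.first.sample, sampleset.first.energy)
```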
15. QBM Implementation in Variational Quantum Circuits
- Use QAOA or VQE-style circuits to approximate Gibbs state
- Optimize circuit parameters via classical feedback
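A minimal VQE-style loop in PennyLane against the Hamiltonian form from Section 6 (coefficients illustrative). Note that plain energy minimization targets the ground state; preparing a finite-temperature Gibbs state requires extra machinery (e.g., ancilla-based purification), so this shows only the variational optimization skeleton:

```python
import pennylane as qml
from pennylane import numpy as np

H = qml.Hamiltonian(
    [0.5, 0.7, 0.4],
    [qml.PauliZ(0), qml.PauliZ(0) @ qml.PauliZ(1), qml.PauliX(0) @ qml.PauliX(1)],
)

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def cost(params):
    # Shallow illustrative ansatz
    qml.RY(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(H)

opt = qml.GradientDescentOptimizer(stepsize=0.2)
params = np.array([0.1, 0.1], requires_grad=True)
for _ in range(50):
    params = opt.step(cost, params)   # classical feedback loop
print(cost(params))
```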
16. Applications of QBMs
- Generative modeling
- Data compression
- Quantum chemistry
- Anomaly detection
- Quantum-inspired autoencoders
17. Challenges in Realizing QBMs
- State preparation and measurement fidelity
- Gradient vanishing and barren plateaus
- Complexity of simulating thermal states
18. Hybrid Quantum-Classical Strategies for QBM Training
- Classical preprocessing (feature selection, PCA)
- Quantum inference/sampling
- Use hybrid loss functions combining classical and quantum observables
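A sketch of the classical preprocessing stage, assuming scikit-learn's `PCA`: raw features are compressed to a few components and rescaled into rotation angles that a downstream quantum circuit could encode (the dataset and target range are illustrative):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))       # illustrative raw dataset

pca = PCA(n_components=4)            # 4 components -> e.g. 4 qubits downstream
X_reduced = pca.fit_transform(X)

# Rescale each component to [0, pi] for use as single-qubit rotation angles
lo, hi = X_reduced.min(axis=0), X_reduced.max(axis=0)
angles = np.pi * (X_reduced - lo) / (hi - lo)
print(angles.shape)                  # (200, 4): one angle per qubit per sample
```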
19. Software Frameworks Supporting QBM Models
- PennyLane (variational circuits)
- D-Wave Ocean SDK (annealing)
- TensorFlow Quantum (experimental QBM-like workflows)
20. Conclusion
Quantum Boltzmann Machines provide a powerful framework for learning probability distributions on quantum hardware. Although still in early development, QBMs illustrate the potential of quantum energy-based learning models and pave the way toward quantum-native generative AI.