🎨 Quantum Generative Models

Creating synthetic data with quantum computing


🌟 Quantum Creativity

Generative models learn to create new data samples from training distributions—images, molecules, text. Quantum generative models leverage superposition and entanglement to represent and sample from exponentially large probability distributions, potentially modeling distributions that classical generators cannot efficiently represent.

💡 Why Quantum Generation?

Classical GANs struggle with mode collapse and often require millions of parameters. Quantum circuits naturally represent probability distributions via Born's rule: P(x) = |⟨x|ψ⟩|². A 10-qubit circuit encodes 2¹⁰ = 1024 probabilities; 20 qubits encode 2²⁰ ≈ 10⁶. Quantum sampling can thus provide exponential representational capacity from polynomially many parameters.

Classical GAN (small): ~10⁶ parameters
Classical GAN (large): ~10⁹ parameters
Quantum GAN (10 qubits): ~50 parameters, 2¹⁰ states
Quantum GAN (20 qubits): ~100 parameters, 2²⁰ states
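The parameter counts above can be illustrated with a toy state-vector simulation: a single layer of RY rotations uses only n parameters yet defines 2ⁿ Born-rule probabilities. This is a minimal NumPy sketch of the counting argument only—a product state like this is not entangled, and a real quantum GAN would add entangling layers for expressivity.

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

n = 10
rng = np.random.default_rng(0)
thetas = rng.uniform(0, np.pi, n)  # only n = 10 parameters

# Apply one RY per qubit to |0...0>: n parameters -> 2^n amplitudes
state = np.array([1.0])
for th in thetas:
    state = np.kron(state, ry(th) @ np.array([1.0, 0.0]))

probs = np.abs(state) ** 2  # Born's rule: 2^10 = 1024 probabilities
```

Ten parameters already define 1024 probabilities; doubling the qubit count squares the state space while the parameter count merely doubles.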

🎯 What You'll Learn

⚛️ QGAN Architecture: quantum generator + discriminator
🧬 Born Machines: direct quantum probability sampling
🔄 Quantum VAEs: variational autoencoders
🚀 Real Applications: molecules, images, finance

📊 Generative Paradigms

🧠 Classical Models
GANs: Adversarial training
VAEs: Variational inference
Diffusion: Denoising process
⚛️ Quantum Models
QGANs: Quantum generator
Born Machines: Circuit sampling
QVAEs: Quantum latent space
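To make the quantum column concrete, here is a toy two-qubit Born machine, state-vector simulated in NumPy (the names `born_probs` and `kl` are ours, not a library API). Two RY rotations followed by a CNOT are trained by finite-difference gradient descent to minimize the KL divergence to a correlated target distribution—one that concentrates on |00⟩ and |11⟩ and therefore cannot be represented without entanglement.

```python
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def born_probs(thetas):
    # Ansatz: RY(a) x RY(b) on |00>, then CNOT; Born's rule gives P(x)
    state = CNOT @ np.kron(ry(thetas[0]) @ [1.0, 0.0],
                           ry(thetas[1]) @ [1.0, 0.0])
    return state ** 2  # amplitudes are real for this circuit

def kl(p, q):
    # KL divergence from model q to target p
    return float(np.sum(p * np.log(p / np.clip(q, 1e-12, None))))

target = np.array([0.4, 0.1, 0.1, 0.4])  # correlated: needs entanglement
thetas = np.array([0.3, 0.3])
lr, eps = 0.25, 1e-5

for _ in range(500):
    grad = np.zeros(2)
    for i in range(2):
        d = np.zeros(2); d[i] = eps
        grad[i] = (kl(target, born_probs(thetas + d))
                   - kl(target, born_probs(thetas - d))) / (2 * eps)
    thetas -= lr * grad
```

After training, the circuit's measurement distribution matches the target; on hardware the same loop would use sampled frequencies (and, e.g., parameter-shift gradients) instead of exact probabilities.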

🔬 Key Insight: Born's Rule

Quantum states |ψ⟩ define probability distributions via P(x) = |⟨x|ψ⟩|²—Born's rule. By training a quantum circuit to prepare a state whose distribution matches a target, we obtain a generative model directly: every measurement of the circuit draws a sample from P(x). Sampling from |ψ⟩ this way can be exponentially cheaper than computing all 2ⁿ probabilities explicitly.

|ψ⟩ = Σ αᵢ|i⟩ → P(i) = |αᵢ|²
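The correspondence above can be checked in a few lines of NumPy: fix an amplitude vector, square it to get Born-rule probabilities, and treat repeated "measurements" as draws from that distribution (here classically simulated with `rng.choice`).

```python
import numpy as np

rng = np.random.default_rng(42)

# |psi> = 0.6|00> + 0.8|11>, a normalized 2-qubit state
alpha = np.array([0.6, 0.0, 0.0, 0.8])
probs = np.abs(alpha) ** 2  # Born's rule: P(i) = |alpha_i|^2

# Repeated measurement of the state = sampling from P
samples = rng.choice(len(alpha), size=10_000, p=probs)
freq = np.bincount(samples, minlength=4) / len(samples)
```

The empirical frequencies concentrate near P(00) = 0.36 and P(11) = 0.64, with the zero-amplitude outcomes |01⟩ and |10⟩ never appearing.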