🎯 Quantum Support Vector Machines

Toward exponential speedups for high-dimensional classification


🌟 Quantum Classification Power

Support Vector Machines (SVMs) find optimal decision boundaries by maximizing the margin between classes, achieving state-of-the-art accuracy on text, images, and genomics. Quantum SVMs (QSVMs) leverage quantum kernels and amplitude estimation to classify in exponentially high-dimensional feature spaces, with the potential for exponential speedups over classical methods under certain assumptions.

💡 Why Quantum SVMs?

Classical SVMs compute kernel matrices K[i,j] = k(xᵢ, xⱼ) in O(n²d) time for n samples and d features. With quantum feature maps ϕ: x → |ϕ(x)⟩, the inner products ⟨ϕ(xᵢ)|ϕ(xⱼ)⟩ can be estimated via quantum kernel estimation, and, assuming efficient state preparation (e.g., QRAM access), the HHL-based QSVM runs in time polylogarithmic in n and d: a potential exponential speedup for high-dimensional kernels.
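For small systems, the kernel-estimation step can be emulated classically. The sketch below uses plain NumPy and assumes a simple RY angle-encoding feature map (one illustrative choice among many; `feature_map` and `quantum_kernel` are names chosen here, not a library API). It computes the fidelity kernel |⟨ϕ(x₁)|ϕ(x₂)⟩|² that a quantum kernel estimator would sample on hardware:

```python
import numpy as np

def feature_map(x, n_qubits=2):
    """Angle-encode x as |phi(x)> = RY(x_0)|0> ⊗ ... ⊗ RY(x_{n-1})|0>.

    Illustrative encoding: each feature sets one qubit's rotation angle.
    """
    state = np.array([1.0])
    for i in range(n_qubits):
        theta = x[i % len(x)]
        state = np.kron(state, np.array([np.cos(theta / 2), np.sin(theta / 2)]))
    return state

def quantum_kernel(x1, x2, n_qubits=2):
    """Fidelity kernel k(x1, x2) = |<phi(x1)|phi(x2)>|^2, which hardware
    estimates via a swap test or by inverting the feature-map circuit."""
    return abs(np.vdot(feature_map(x1, n_qubits), feature_map(x2, n_qubits))) ** 2

x = np.array([0.3, 1.2])
z = np.array([1.0, 0.2])
print(round(quantum_kernel(x, x), 6))   # 1.0: identical inputs give unit fidelity
print(0.0 <= quantum_kernel(x, z) <= 1.0)  # True: fidelities are bounded
```

On a real device each kernel entry is estimated from repeated measurements, so the entries carry sampling noise that grows with the precision required.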

Classical SVM (10K samples):
~10⁸ kernel evaluations
Quantum SVM (10K samples):
~10⁴ quantum circuits

🎯 What You'll Learn

⚛️
Quantum Kernels
Map data to Hilbert space
📐
Quantum Least-Squares
HHL algorithm for optimization
🧬
Feature Map Design
ZZ, Pauli, amplitude encoding
🚀
Real Applications
Drug discovery, fraud detection
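Of the encodings listed above, amplitude encoding is the most compact: n features fit in ⌈log₂ n⌉ qubits. A minimal NumPy sketch (the `amplitude_encode` helper is hypothetical; on real hardware a state-preparation circuit is needed, whose depth can offset the qubit savings):

```python
import numpy as np

def amplitude_encode(x):
    """Store a feature vector directly in qubit amplitudes.

    A length-2^n vector needs only n qubits, but quantum states must be
    normalized, so only the direction of x is preserved.
    """
    x = np.asarray(x, dtype=float)
    return x / np.linalg.norm(x)

state = amplitude_encode([3.0, 1.0, 2.0, 1.0])  # 4 features -> 2 qubits
print(state.size)                       # 4 amplitudes
print(round(np.linalg.norm(state), 6))  # 1.0: a valid quantum state
```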

📊 Classical vs Quantum SVM

Classical Kernel SVM: O(n² × d)

Compute K = k(X, X) → solve dual: max Σαᵢ - ½ΣαᵢαⱼyᵢyⱼK[i,j] → predict

Quantum SVM: O(log n × poly(log d))

Encode X → quantum kernel K_Q → HHL solve → quantum amplitude estimation
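The quantum pipeline above can be emulated end to end on toy data. In the NumPy sketch below (all names are chosen for illustration), an RY angle-encoding fidelity kernel stands in for the hardware kernel estimator, and a plain linear solve stands in for HHL: the least-squares SVM reduces training to exactly the kind of linear system HHL targets. The bias term and equality constraint of the full LS-SVM are dropped for brevity:

```python
import numpy as np

def feature_map(x):
    """2-qubit RY angle encoding of a 2-feature sample (illustrative)."""
    q = lambda t: np.array([np.cos(t / 2), np.sin(t / 2)])
    return np.kron(q(x[0]), q(x[1]))

def fidelity_kernel(A, B):
    """K[i, j] = |<phi(a_i)|phi(b_j)>|^2, the matrix a quantum kernel
    estimator would fill entry by entry."""
    Pa = np.array([feature_map(a) for a in A])
    Pb = np.array([feature_map(b) for b in B])
    return np.abs(Pa @ Pb.T) ** 2

# Toy 2-class data: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.5, 0.2, (20, 2)), rng.normal(2.0, 0.2, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)

# Least-squares SVM: solve (K + I/gamma) alpha = y.
# This linear solve is the step HHL would perform on a quantum computer.
K = fidelity_kernel(X, X)
alpha = np.linalg.solve(K + 0.1 * np.eye(len(y)), y)

def predict(X_new):
    return np.sign(fidelity_kernel(X_new, X) @ alpha)

print(np.mean(predict(X) == y))  # training accuracy on the blobs
```

The classical pipeline from the previous row differs only in the solver: it maximizes the dual objective with inequality constraints instead of solving a single regularized linear system.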

🔬 Key Insight: Quantum Feature Spaces

Quantum circuits naturally create exponentially large feature spaces: a 10-qubit circuit maps x → |ϕ(x)⟩ into 2¹⁰ = 1024 dimensions, and 20 qubits give 2²⁰ ≈ 1 million. Sheer dimension isn't the whole story (classical kernels like the RBF are effectively infinite-dimensional); the hoped-for quantum advantage comes from feature maps whose kernels are believed hard to estimate classically, enabling efficient learning of complex, non-linear boundaries.
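The dimension counting above is easy to verify: tensoring n single-qubit states yields a vector with 2ⁿ amplitudes.

```python
import numpy as np

# Build |+>^(⊗10): each Kronecker product doubles the amplitude count
plus = np.array([1.0, 1.0]) / np.sqrt(2)
state = np.array([1.0])
for _ in range(10):
    state = np.kron(state, plus)

print(state.size)  # 1024 = 2**10 dimensions from only 10 qubits
```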