Resources:
- qiskit-community, “qiskit-machine-learning: An open-source library built on Qiskit for quantum machine learning tasks at scale on quantum hardware and classical simulators,” GitHub, Dec. 24, 2025. https://github.com/qiskit-community/qiskit-machine-learning (accessed Feb. 02, 2026).
What is Qiskit Machine Learning?
Qiskit Machine Learning is an open-source library built on top of the core Qiskit framework that integrates quantum computing with classical machine learning. It acts as a bridge, allowing researchers and developers to use quantum algorithms for tasks like classification, regression, and clustering.
Instead of manually building every gate, Qiskit Machine Learning provides high-level “building blocks” that plug directly into standard data science workflows.
Key Components
- Quantum Kernels: These use a quantum circuit to map data into a high-dimensional “feature space.” This can help a classical kernel method (like a Support Vector Machine) find patterns that may be hard to capture with standard classical kernels (see the kernel sketch after this list).
- Quantum Neural Networks (QNNs): These are parameterized quantum circuits that behave like classical neural networks. They can be trained to recognize data patterns by adjusting the “weights” (rotation angles) of the gates.
- Integration with PyTorch: Through the TorchConnector, you can combine quantum layers with classical deep learning models, creating a hybrid quantum-classical model (see the connector sketch after this list).
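To make the kernel idea concrete, here is a minimal sketch (not taken from the repository) that builds a fidelity-based quantum kernel and plugs it into QSVC, the library's kernelized Support Vector Machine. The four-point dataset is made up for illustration, and the kernel's default fidelity primitive is assumed:
import numpy as np
from qiskit.circuit.library import zz_feature_map
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC
# Toy dataset: 4 samples, 2 features, binary labels (illustrative only)
X = np.array([[0.1, 0.5], [0.9, 0.2], [0.2, 0.8], [0.8, 0.1]])
y = np.array([0, 1, 0, 1])
# Each kernel entry K(x, x') is the fidelity between the two encoded states
feature_map = zz_feature_map(feature_dimension=2, reps=1)
qkernel = FidelityQuantumKernel(feature_map=feature_map)
# QSVC is scikit-learn's SVC with the quantum kernel plugged in
qsvc = QSVC(quantum_kernel=qkernel)
qsvc.fit(X, y)
print(qsvc.predict(X))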
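Similarly, a minimal sketch of the PyTorch integration: a one-qubit EstimatorQNN wrapped with TorchConnector so it behaves like a torch.nn.Module. The circuit, parameter names, and input value are made up for illustration, and a statevector estimator is assumed:
import torch
from qiskit.circuit import Parameter, QuantumCircuit
from qiskit.primitives import StatevectorEstimator
from qiskit_machine_learning.neural_networks import EstimatorQNN
from qiskit_machine_learning.connectors import TorchConnector
# One-qubit toy circuit: one input parameter and one trainable weight
x = Parameter("x")
w = Parameter("w")
qc = QuantumCircuit(1)
qc.ry(x, 0)
qc.rx(w, 0)
# EstimatorQNN measures a default Pauli-Z observable on the circuit
qnn = EstimatorQNN(
    circuit=qc,
    input_params=[x],
    weight_params=[w],
    estimator=StatevectorEstimator(),
)
# TorchConnector exposes the QNN as a torch.nn.Module with trainable weights
model = TorchConnector(qnn)
print(model(torch.tensor([[0.3]])))  # forward pass on a single sample
The wrapped model can then be placed inside a larger PyTorch network and trained with standard PyTorch optimizers.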
Example: Variational Quantum Classifier (VQC)
The VQC is one of the most popular algorithms in the library. It uses a “Feature Map” to encode your data into quantum states and an “Ansatz” (a trainable circuit) to learn the classification boundary.
Below is a concise example of how to set up and train a VQC using the latest Qiskit Machine Learning syntax:
!pip install qiskit qiskit_machine_learning
Successfully installed qiskit_machine_learning-0.9.0 (qiskit 2.3.0 already satisfied)
from qiskit.circuit.library import zz_feature_map, real_amplitudes
from qiskit_machine_learning.algorithms import VQC
from qiskit_machine_learning.optimizers import COBYLA
from qiskit.primitives import StatevectorSampler
import numpy as np
# 1. Prepare dummy data (2 features, binary classification)
num_features = 2
train_features = np.array([[0.1, 0.5], [0.9, 0.2], [0.2, 0.8], [0.8, 0.1]])
train_labels = np.array([0, 1, 0, 1])
# We need test labels to calculate accuracy
test_features = np.array([[0.15, 0.45], [0.85, 0.15]])
test_labels = np.array([0, 1])
# 2. Define the Quantum components
# Feature Map: Encodes classical data into a quantum state
feature_map = zz_feature_map(feature_dimension=num_features, reps=1)
# Ansatz: The "brain" of the circuit with trainable parameters
ansatz = real_amplitudes(num_qubits=num_features, reps=1)
# 3. Define a callback to track the loss during training
loss_values = []
def callback_fn(weights, obj_value):
    loss_values.append(obj_value)
    print(f"Iteration {len(loss_values)}: Loss = {obj_value:.4f}")
# 4. Initialize the VQC
# We use a Sampler primitive to run the circuits
vqc = VQC(
    feature_map=feature_map,
    ansatz=ansatz,
    loss="cross_entropy",
    optimizer=COBYLA(maxiter=20),
    callback=callback_fn,
    sampler=StatevectorSampler(),
)
# 5. Train the model
vqc.fit(train_features, train_labels)
# 6. Predict on new data
predictions = vqc.predict(test_features)
# 7. Report results and print the circuit
print("\n--- Predictions ---")
print(f"Predictions: {predictions}")
# Calculate the accuracy scores
train_score = vqc.score(train_features, train_labels)
test_score = vqc.score(test_features, test_labels)
print(f"Training Accuracy: {train_score * 100:.1f}%")
print(f"Testing Accuracy: {test_score * 100:.1f}%")
print("\n--- Circuit Diagram ---")
# The composed model (feature map + ansatz) is a plain QuantumCircuit, so we can draw it directly
print(vqc.circuit.draw(output='text'))
WARNING:qiskit_machine_learning.neural_networks.sampler_qnn:No gradient function provided, creating a gradient function. If your Sampler requires transpilation, please provide a pass manager.
Iteration 1: Loss = 1.1840
Iteration 2: Loss = 0.9312
Iteration 3: Loss = 1.0437
Iteration 4: Loss = 1.2540
Iteration 5: Loss = 0.9964
Iteration 6: Loss = 1.0521
Iteration 7: Loss = 0.9558
Iteration 8: Loss = 1.0321
Iteration 9: Loss = 0.9830
Iteration 10: Loss = 0.9992
Iteration 11: Loss = 1.0141
Iteration 12: Loss = 0.9899
Iteration 13: Loss = 0.9586
Iteration 14: Loss = 1.0389
Iteration 15: Loss = 0.9804
Iteration 16: Loss = 0.9767
Iteration 17: Loss = 0.9728
Iteration 18: Loss = 0.9818
Iteration 19: Loss = 0.9976
Iteration 20: Loss = 1.0197
--- Predictions ---
Predictions: [0 0]
Training Accuracy: 50.0%
Testing Accuracy: 50.0%
--- Circuit Diagram ---
┌───┐┌───────────┐ ┌──────────┐»
q_0: ┤ H ├┤ P(2*x[0]) ├──■────────────────────────────────────■──┤ Ry(θ[0]) ├»
├───┤├───────────┤┌─┴─┐┌──────────────────────────────┐┌─┴─┐├──────────┤»
q_1: ┤ H ├┤ P(2*x[1]) ├┤ X ├┤ P((-π + x[0])*(-π + x[1])*2) ├┤ X ├┤ Ry(θ[1]) ├»
└───┘└───────────┘└───┘└──────────────────────────────┘└───┘└──────────┘»
« ┌──────────┐
«q_0: ──■──┤ Ry(θ[2]) ├
« ┌─┴─┐├──────────┤
«q_1: ┤ X ├┤ Ry(θ[3]) ├
« └───┘└──────────┘
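Because the callback stores every objective value in loss_values, you can also plot the optimization trace after training. A minimal sketch using matplotlib (not part of the original notebook):
import matplotlib.pyplot as plt
# loss_values was populated by callback_fn during vqc.fit(...)
plt.plot(range(1, len(loss_values) + 1), loss_values)
plt.xlabel("COBYLA iteration")
plt.ylabel("Cross-entropy loss")
plt.title("VQC training loss")
plt.show()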
Visual representation of the code (Mermaid flowchart):
graph LR
Start([Start]) --> Data[<b>Data Prep</b><br/>2 Features]
subgraph QC [Quantum Circuit]
Data --> FM[<b>Feature Map</b><br/>ZZFeatureMap]
FM --> AZ[<b>Ansatz</b><br/>RealAmplitudes]
end
QC --> VQC[<b>VQC Init</b><br/>COBYLA / Sampler]
subgraph Loop [Training Loop]
VQC --> Fit{vqc.fit}
Fit -- Update Params --> Fit
end
Fit --> Res[<b>Results</b><br/>Predict / Score]
Res --> End([End])
style Loop fill:#f9f,stroke:#333
style QC fill:#bbf,stroke:#333
GitHub Link:
https://github.com/computingnotes/QuantumFederatedLearning/blob/main/Qiskit_Machine_Learning.ipynb