
Commit

merge with main, changes of rst and notebooks
Fradm98 committed Nov 17, 2023
2 parents dea368b + 5c15ede commit 6ed1f24
Showing 13 changed files with 298 additions and 103 deletions.
2 changes: 2 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -7,3 +7,5 @@ src/quask/core_implementation/__pycache__/*
src/quask/evaluator/__pycache__/*
src/quask/optimizer/__pycache__/*
.ipynb_checkpoints/*
.vs_code/*
.vscode/*
1 change: 1 addition & 0 deletions .readthedocs.yaml
@@ -31,5 +31,6 @@ sphinx:
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
setup_py_install: true
install:
- requirements: docs/requirements.txt
2 changes: 1 addition & 1 deletion docs/requirements.txt
@@ -1,2 +1,2 @@
furo
sphinx-rtd-theme
sphinx-rtd-theme
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -42,7 +42,7 @@
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = []
exclude_patterns = ['_build', '**.ipynb_checkpoints']

autodoc_mock_imports = ["skopt", "skopt.space", "django", "mushroom_rl", "opytimizer", "pennylane", "qiskit", "qiskit_ibm_runtime", "qiskit_aer"]

140 changes: 95 additions & 45 deletions docs/source/notebooks/quantum_0_intro.ipynb

Large diffs are not rendered by default.

58 changes: 38 additions & 20 deletions docs/source/notebooks/quantum_2_projected.ipynb
@@ -26,7 +26,9 @@
"id": "a9eb156b-61e8-43db-8074-8c24e908f6f1",
"metadata": {},
"source": [
"Use of projector / partial tracesmances. "
"To understand projected quantum kernels, we should first understand the limitations\n",
"of \"traditional\" quantum kernels. These limitations have deep implications\n",
"for our understanding of kernel methods, also from a classical perspective."
]
},
{
@@ -36,9 +38,13 @@
"source": [
"## Expressibility and curse of dimensionality in kernel methods\n",
"\n",
"When dealing with kernel methods, whether classical or quantum, we must exercise caution when working in high-dimensional (or even infinite-dimensional) Hilbert spaces. This is due to the fact that in high dimensions, certain unfavorable phenomena can occur, resulting in a kernel machine that, after the training phase, becomes a complex function prone to overfitting. These phenomena are explored in the [upcoming tutorial](xxx).\n",
"When approaching an ML problem, we may ask whether it makes sense at all to use QML techniques such as quantum kernel methods. In recent years we have learned \\[kbs21\\], \\[Hua21\\] that having a large Hilbert space in which we can compute classically intractable inner products does not guarantee an advantage. But why?\n",
"\n",
"For instance, in the classical context, the Gaussian kernel maps any $\\mathbf{x} \\in \\mathbb{R}^d$ to a multi-dimensional Gaussian distribution with an average of $\\mathbf{x}$ and a covariance matrix of $\\sigma I$. When $\\sigma$ is small, data points are mapped to different regions of this infinite-dimensional Hilbert space, and $\\kappa(\\mathbf{x}, \\mathbf{x}') \\approx 0$ for all $\\mathbf{x} \\neq \\mathbf{x}'$. To avoid this, a larger $\\sigma$ is chosen to ensure that most data points relevant to our task have some nontrivial overlap.\n",
"When dealing with kernel methods, whether classical or quantum, we must exercise caution when working in high-dimensional (or even infinite-dimensional) Hilbert spaces. This is because in\n",
"high dimensions generalization becomes hard, _i.e._ the trained kernel is prone to overfitting.\n",
"In turn, a number of datapoints exponential in the number of features/qubits is needed to learn the target function we aim to estimate. These phenomena are explored in the [upcoming tutorial](xxx).\n",
"\n",
"For instance, in the classical context, the Gaussian kernel maps any $\\mathbf{x} \\in \\mathbb{R}^d$ to a multi-dimensional Gaussian distribution with an average of $\\mathbf{x}$ and a covariance matrix of $\\sigma I$. When $\\sigma$ is small, data points are mapped to different regions of this infinite-dimensional Hilbert space, and $\\kappa(\\mathbf{x}, \\mathbf{x}') \\approx 0$ for all $\\mathbf{x} \\neq \\mathbf{x}'$. This phenomenon is known as the _curse of dimensionality_, or the _orthogonality catastrophe_. To avoid this, a larger $\\sigma$ is chosen to ensure that most data points relevant to our task have some nontrivial overlap.\n",
"\n",
"As the Hilbert space for quantum systems grows exponentially with the number of qubits $n$, similar challenges can arise when using quantum kernels. This situation occurs with expressible $U(\\cdot)$, which allows access to various regions within the Hilbert space. In such cases, similar to classical kernels, techniques must be employed to control expressibility and, consequently, the model's complexity."
]
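As a hypothetical aside, the orthogonality catastrophe discussed in the cell above can be checked numerically in plain NumPy (this snippet is illustrative and independent of the notebook's quask code; the sample size, dimension, and bandwidths are arbitrary choices): a narrow Gaussian kernel matrix on random high-dimensional points collapses toward the identity, while a wider bandwidth keeps nontrivial overlaps.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))  # 20 random points in d = 50 dimensions

def gaussian_kernel(X, sigma):
    # kappa(x, x') = exp(-||x - x'||^2 / (2 * sigma^2))
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2 * sigma ** 2))

for sigma in (1.0, 5.0, 20.0):
    K = gaussian_kernel(X, sigma)
    off_diag = K[~np.eye(len(X), dtype=bool)]
    # With small sigma the off-diagonal entries vanish: K is close to the
    # identity, i.e. every pair of distinct points is nearly orthogonal.
    print(f"sigma = {sigma:5.1f}   max off-diagonal overlap = {off_diag.max():.3e}")
```

The same collapse occurs for expressible quantum feature maps as the number of qubits grows, which is the analogy drawn in the text.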
@@ -50,7 +56,7 @@
"source": [
"### Projection as expressibility control\n",
"\n",
"The authors of \\[Hua21\\], who initially addressed the challenge of the exponential dimensionality of Hilbert space in the context of quantum kernels, have introduced the concept of _projected quantum kernels_ to mitigate this issue.\n",
"The authors of \\[Hua21\\], who initially addressed the challenge of the exponential dimensionality of Hilbert space in the context of quantum kernels, introduced the concept of _projected quantum kernels_ to mitigate this issue. Later, \\[kbs21\\] showed that this projected kernel must be combined with a suitable inductive bias to achieve good performance.\n",
"\n",
"The concept is straightforward: first, the unitary transformation $U$ maps classical data into the Hilbert space of the quantum system. Subsequently, a projection maps these elements back to a lower-dimensional Hilbert space. The overall transformation, thanks to the contribution of $U$, remains beyond the capabilities of classical kernels.\n",
"\n",
@@ -73,10 +79,22 @@
},
{
"cell_type": "code",
"execution_count": null,
"execution_count": 1,
"id": "d1d0e2df-e91a-48ed-8410-b83a580c1b56",
"metadata": {},
"outputs": [],
"outputs": [
{
"ename": "ModuleNotFoundError",
"evalue": "No module named 'quask'",
"output_type": "error",
"traceback": [
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
"\u001b[0;31mModuleNotFoundError\u001b[0m Traceback (most recent call last)",
"\u001b[1;32m/Users/fradm98/Desktop/QuASK/docs/source/notebooks/quantum_2_projected.ipynb Cell 7\u001b[0m line \u001b[0;36m1\n\u001b[0;32m----> <a href='vscode-notebook-cell:/Users/fradm98/Desktop/QuASK/docs/source/notebooks/quantum_2_projected.ipynb#W6sZmlsZQ%3D%3D?line=0'>1</a>\u001b[0m \u001b[39mfrom\u001b[39;00m \u001b[39mquask\u001b[39;00m\u001b[39m.\u001b[39;00m\u001b[39mcore\u001b[39;00m \u001b[39mimport\u001b[39;00m Ansatz, Kernel, KernelFactory, KernelType\n\u001b[1;32m <a href='vscode-notebook-cell:/Users/fradm98/Desktop/QuASK/docs/source/notebooks/quantum_2_projected.ipynb#W6sZmlsZQ%3D%3D?line=2'>3</a>\u001b[0m N_FEATURES \u001b[39m=\u001b[39m \u001b[39m2\u001b[39m\n\u001b[1;32m <a href='vscode-notebook-cell:/Users/fradm98/Desktop/QuASK/docs/source/notebooks/quantum_2_projected.ipynb#W6sZmlsZQ%3D%3D?line=3'>4</a>\u001b[0m N_OPERATIONS \u001b[39m=\u001b[39m \u001b[39m3\u001b[39m\n",
"\u001b[0;31mModuleNotFoundError\u001b[0m: No module named 'quask'"
]
}
],
"source": [
"from quask.core import Ansatz, Kernel, KernelFactory, KernelType\n",
"\n",
@@ -95,9 +113,9 @@
"id": "60a8e11d-4858-46fe-9ea7-7198d905092d",
"metadata": {},
"source": [
"Now, by employing the SWAP test over a subset of the $n$ qubits, only a small and constant number of qubits are measured while the rest remain unmeasured. This calculation is equivalent to performing the inner product between partial traces of two quantum-encoded data points can be achieved. \n",
"Now, by employing the SWAP test over a subset of the $n$ qubits, only a small and constant number of qubits are measured, hence projecting onto the subspace of the qubits selected by the SWAP test. This calculation is equivalent to performing the inner product between partial traces of two quantum-encoded data points. \n",
"\n",
"In the following example, the measurement is performed only on the first of two qubits.\n"
"In the following example, the measurement is performed only on the first of two qubits."
]
},
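The partial-trace construction described in the cell above can be illustrated without quask. The following is a hypothetical two-qubit sketch in plain NumPy (the `ry`/`encode` feature map is an invented toy, not the quask ansatz used in this notebook): the second qubit is traced out, and the projected kernel is the Hilbert-Schmidt inner product of the reduced density matrices.

```python
import numpy as np

def ry(theta):
    # Single-qubit Y-rotation.
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def encode(x):
    # Toy feature map: |psi(x)> = CNOT (RY(x0)|0> ⊗ RY(x1)|0>)
    psi = np.kron(ry(x[0]) @ [1.0, 0.0], ry(x[1]) @ [1.0, 0.0])
    return CNOT @ psi

def partial_trace_second_qubit(rho):
    # Trace out qubit 2 of a 4x4 two-qubit density matrix.
    return np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

def projected_kernel(x1, x2):
    # Hilbert-Schmidt inner product Tr(rho_1 rho_2) of the reduced states.
    rho1 = partial_trace_second_qubit(np.outer(encode(x1), encode(x1)))
    rho2 = partial_trace_second_qubit(np.outer(encode(x2), encode(x2)))
    return float(np.trace(rho1 @ rho2))

print(projected_kernel([0.0, 0.0], [0.0, 0.0]))  # reduced state is pure here
```

Note that when the encoding entangles the qubits, the reduced state is mixed, so the projected kernel on identical inputs equals the purity of the reduced state and can be strictly less than one.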
{
Expand Down Expand Up @@ -238,17 +256,17 @@
"id": "c2e7e570-460c-47c0-8519-dc44698c4f54",
"metadata": {},
"source": [
"### Geometry Test\r\n",
"\r\n",
"The geometry test, introduced by [Hua21], serves as a means to assess whether a particular dataset holds the potential for a quantum advantage or if such an advantage is unlikely. The test operates as follows:\r\n",
"\r\n",
"- When $g(K_C, K_Q) \\ll \\sqrt{m}$, a classical kernel exhibits behavior similar to the quantum kernel, rendering the use of the quantum kernel redundant.\r\n",
"\r\n",
"- When $g(K_C, K_Q) \\approx \\sqrt{m}$, the quantum kernel significantly deviates from all tested classical kernels. The outcome depends on the complexity of classical kernel machines:\r\n",
" - If the complexity of any classical kernel machine is low ($s_{K_C} \\ll m$), classical kernels perform well, and the quantum kernel's divergence from classical $K_C$, doesn't yield superior performance.\r\n",
" - When the complexity of all classical kernel machines is high ($s_{K_C} \\approx m$), classical models struggle to learn the function $f$. In this scenario:\r\n",
" - If the quantum model's complexity is low ($s_{K_Q} \\ll m$), the quantum kernel successfully solves the task while the classical models do not.\r\n",
" - If the quantum model's complexity is high ($s_{K_Q} \\approx m$), even the quantum model struggles to solve the problem.\r\n"
"### Geometry Test\n",
"\n",
"The geometry test, introduced by [Hua21], serves as a means to assess whether a particular dataset holds the potential for a quantum advantage or if such an advantage is unlikely. The test operates as follows:\n",
"\n",
"- When $g(K_C, K_Q) \\ll \\sqrt{m}$, a classical kernel exhibits behavior similar to the quantum kernel, rendering the use of the quantum kernel redundant.\n",
"\n",
"- When $g(K_C, K_Q) \\approx \\sqrt{m}$, the quantum kernel significantly deviates from all tested classical kernels. The outcome depends on the complexity of classical kernel machines:\n",
" - If the complexity of any classical kernel machine is low ($s_{K_C} \\ll m$), classical kernels perform well, and the quantum kernel's divergence from the classical $K_C$ doesn't yield superior performance.\n",
" - When the complexity of all classical kernel machines is high ($s_{K_C} \\approx m$), classical models struggle to learn the function $f$. In this scenario:\n",
" - If the quantum model's complexity is low ($s_{K_Q} \\ll m$), the quantum kernel successfully solves the task while the classical models do not.\n",
" - If the quantum model's complexity is high ($s_{K_Q} \\approx m$), even the quantum model struggles to solve the problem.\n"
]
},
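The quantity $g(K_C, K_Q)$ in the cell above is the geometric difference of \[Hua21\], $g(K_C \| K_Q) = \sqrt{\lVert \sqrt{K_Q}\, K_C^{-1} \sqrt{K_Q} \rVert_\infty}$ with the spectral norm. A minimal hypothetical NumPy sketch follows; the regularization parameter `reg` is an assumption added here for numerical stability and is not part of the original definition.

```python
import numpy as np

def psd_sqrt(K):
    # Symmetric square root of a positive semi-definite matrix.
    w, V = np.linalg.eigh(K)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def geometric_difference(K_C, K_Q, reg=1e-9):
    # g(K_C || K_Q) = sqrt( || sqrt(K_Q) K_C^{-1} sqrt(K_Q) ||_inf )
    sq = psd_sqrt(K_Q)
    M = sq @ np.linalg.inv(K_C + reg * np.eye(len(K_C))) @ sq
    return float(np.sqrt(np.linalg.eigvalsh(M).max()))

# Sanity check: identical kernels give g close to 1 -- the quantum kernel
# adds no geometry beyond the classical one, so no advantage is possible.
K = np.eye(5)
print(geometric_difference(K, K))
```

In practice $K_C$ and $K_Q$ would be Gram matrices evaluated on the same $m$ training points, and $g$ is compared against $\sqrt{m}$ as in the bullet points above.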
{
@@ -288,7 +306,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.12.0"
}
},
"nbformat": 4,
Binary file added docs/source/tutorials_quantum/overlap_test.png
