|
2 | 2 | "cells": [
|
3 | 3 | {
|
4 | 4 | "cell_type": "markdown",
|
| 5 | + "metadata": { |
| 6 | + "collapsed": false, |
| 7 | + "jupyter": { |
| 8 | + "outputs_hidden": false |
| 9 | + } |
| 10 | + }, |
5 | 11 | "source": [
|
6 | 12 | "# Time Series Clustering\n",
|
7 | 13 | "\n",
|
|
23 | 29 | "erative [16], Feature K-means [17], Feature K-medoids [17], U-shapelets [18],\n",
|
24 | 30 | "USSL [19], RSFS [20], NDFS [21], Deep learning and dimensionality reduction\n",
|
25 | 31 | "approaches see [22]"
|
26 |
| - ], |
27 |
| - "metadata": { |
28 |
| - "collapsed": false |
29 |
| - } |
| 32 | + ] |
30 | 33 | },
|
31 | 34 | {
|
32 | 35 | "cell_type": "markdown",
|
| 36 | + "metadata": { |
| 37 | + "collapsed": false, |
| 38 | + "jupyter": { |
| 39 | + "outputs_hidden": false |
| 40 | + } |
| 41 | + }, |
33 | 42 | "source": [
|
34 | 43 | "## Clustering notebooks\n",
|
35 | 44 | "\n",
|
36 |
| - "- `aeon` currently focusses on partition based approaches that use elastic distance\n", |
37 |
| - "functions. The [partition based](partitional_clustering.ipynb) note book has an\n", |
38 |
| - "overview of the funtionality in aeon.\n", |
| 45 | + "`aeon` offers a comprehensive suite of time series clustering (TSCL) algorithms, encompassing partition-based, density-based, hierarchical, deep learning, and feature-based approaches.\n", |
| 46 | + "\n", |
| 47 | + "- `aeon` has many partition-based clustering algorithms, which include TimeSeriesKMeans, KMedoids, CLARA, CLARANS, ElasticSOM, and KSpectralCentroid, leveraging elastic distance measures like DTW. The [partition-based](partitional_clustering.ipynb) notebook has an overview of the functionality in aeon.\n", |
39 | 48 | "\n",
|
40 |
| - "- `sklearn` has *density based* and *hierarchical based* clustering algorithms, and\n", |
41 |
| - "these can be used in conjunction with `aeon` elastic distances. See the [sklearn and\n", |
42 |
| - "aeon distances](../distances/sklearn_distances.ipynb) notebook.\n", |
| 49 | + "- `sklearn` has *density-based* and *hierarchical based* clustering algorithms, which can be used in conjunction with `aeon` elastic distances. See the [sklearn and aeon distances](../distances/sklearn_distances.ipynb) notebook.\n", |
43 | 50 | "\n",
|
44 |
| - "- Deep learning based TSCL is a very popular topic, and we are working on bringing\n", |
45 |
| - "deep learning functionality to `aeon`, first algorithms for [Deep learning] are\n", |
46 |
| - "COMING SOON\n", |
| 51 | + "- Bespoke feature-based TSCL algorithms are easily constructed with `aeon` transformers and `sklearn` clusterers in a pipeline. Some examples are in the\n", |
| 52 | + "[sklearn clustering]. The [feature-based](feature_based_clustering.ipynb) notebook gives an overview of the feature-based clusterers in an aeon.\n", |
47 | 53 | "\n",
|
48 |
| - "- Bespoke feature based TSCL algorithms are easily constructed with `aeon`\n", |
49 |
| - "transformers and `sklearn` clusterers in a pipeline. Some examples are in the\n", |
50 |
| - "[sklearn clustering]. We will bring the bespoke feature\n", |
51 |
| - "based clustering algorithms into `aeon` in the medium term.\n", |
| 54 | + "- Deep learning based TSCL is a very popular topic, and we have introduced many deep learning functionalities to `aeon`. Autoencoder-based models like AEFCNClusterer, AEResNetClusterer, and AEDCNNClusterer enable complex pattern discovery.\n", |
52 | 55 | "\n",
|
53 |
| - "We are in the process of extending the bake off described in [1] to include all\n", |
54 |
| - "clusterers. So far, we find medoids with MSM distance is the best performer.\n", |
| 56 | + "- `aeon` also includes averaging-based clustering algorithms, which utilize centroid-based representations of time series.\n", |
| 57 | + "\n", |
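| + "A sketch of pairing `sklearn` clusterers with a precomputed `aeon` distance matrix (`metric=\"precomputed\"` requires scikit-learn >= 1.2; older versions call this argument `affinity`):\n", |
| + "\n", |
| + "```python\n", |
| + "import numpy as np\n", |
| + "from sklearn.cluster import AgglomerativeClustering\n", |
| + "\n", |
| + "from aeon.distances import dtw_pairwise_distance\n", |
| + "\n", |
| + "X = np.random.default_rng(1).normal(size=(20, 1, 50))\n", |
| + "\n", |
| + "# pairwise DTW distances, shape (20, 20)\n", |
| + "dists = dtw_pairwise_distance(X)\n", |
| + "\n", |
| + "# hierarchical clustering on the precomputed matrix\n", |
| + "agg = AgglomerativeClustering(n_clusters=2, metric=\"precomputed\", linkage=\"average\")\n", |
| + "labels = agg.fit_predict(dists)\n", |
| + "```\n", |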
| 58 | + "\n", |
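| + "A sketch of a bespoke feature-based clusterer, pipelining the Catch22 feature transformer into a vanilla `sklearn` k-means (the import path is the recent `aeon` layout and may differ in older releases):\n", |
| + "\n", |
| + "```python\n", |
| + "import numpy as np\n", |
| + "from sklearn.cluster import KMeans\n", |
| + "from sklearn.pipeline import make_pipeline\n", |
| + "\n", |
| + "from aeon.transformations.collection.feature_based import Catch22\n", |
| + "\n", |
| + "X = np.random.default_rng(2).normal(size=(20, 1, 50))\n", |
| + "\n", |
| + "# extract the 22 canonical catch22 features per series, then\n", |
| + "# cluster the resulting tabular feature matrix with standard k-means\n", |
| + "pipe = make_pipeline(Catch22(), KMeans(n_clusters=2, n_init=10))\n", |
| + "labels = pipe.fit_predict(X)\n", |
| + "```\n", |
| + "\n", |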
| 59 | + "We are in the process of extending the bake-off described in [1] to include all clusterers. So far, we find medoids with MSM distance is the best performer.\n", |
55 | 60 | "\n",
|
56 | 61 | "<img src=\"img/clst_cd.png\" width=\"600\" alt=\"cd_diag\">\n",
|
57 | 62 | "\n"
|
58 |
| - ], |
59 |
| - "metadata": { |
60 |
| - "collapsed": false |
61 |
| - } |
| 63 | + ] |
62 | 64 | },
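| + { |
| + "cell_type": "markdown", |
| + "metadata": { |
| + "collapsed": false |
| + }, |
| + "source": [ |
| + "As a rough sketch of the deep learning clusterers (the constructor arguments shown, `n_clusters` and `n_epochs`, are assumptions that have changed between `aeon` releases, so check the API docs):\n", |
| + "\n", |
| + "```python\n", |
| + "import numpy as np\n", |
| + "\n", |
| + "from aeon.clustering.deep_learning import AEFCNClusterer\n", |
| + "\n", |
| + "X = np.random.default_rng(3).normal(size=(20, 1, 50))\n", |
| + "\n", |
| + "# tiny run purely for illustration; real use needs far more epochs\n", |
| + "ae = AEFCNClusterer(n_clusters=2, n_epochs=5)\n", |
| + "labels = ae.fit_predict(X)\n", |
| + "```\n" |
| + ] |
| + }, |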
|
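| + { |
| + "cell_type": "markdown", |
| + "metadata": { |
| + "collapsed": false |
| + }, |
| + "source": [ |
| + "And a sketch of elastic averaging, assuming the `elastic_barycenter_average` helper in `aeon.clustering.averaging` (the name and defaults are worth checking against the API docs):\n", |
| + "\n", |
| + "```python\n", |
| + "import numpy as np\n", |
| + "\n", |
| + "from aeon.clustering.averaging import elastic_barycenter_average\n", |
| + "\n", |
| + "X = np.random.default_rng(4).normal(size=(10, 1, 50))\n", |
| + "\n", |
| + "# DTW barycentre average (DBA) of the collection\n", |
| + "centre = elastic_barycenter_average(X, distance=\"dtw\")\n", |
| + "```\n" |
| + ] |
| + }, |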
63 | 65 | {
|
64 | 66 | "cell_type": "markdown",
|
| 67 | + "metadata": { |
| 68 | + "collapsed": false, |
| 69 | + "jupyter": { |
| 70 | + "outputs_hidden": false |
| 71 | + } |
| 72 | + }, |
65 | 73 | "source": [
|
66 | 74 | "## References\n",
|
67 | 75 | "\n",
|
|
141 | 149 | "[22] B. Lafabregue, J. Weber, P. Gancarski, and G. Forestier. End-to-end deep\n",
|
142 | 150 | "representation learning for time series clustering: a comparative study. Data Mining\n",
|
143 | 151 | "and Knowledge Discovery, 36:29—-81, 2022\n"
|
144 |
| - ], |
145 |
| - "metadata": { |
146 |
| - "collapsed": false |
147 |
| - } |
| 152 | + ] |
148 | 153 | }
|
149 | 154 | ],
|
150 | 155 | "metadata": {
|
151 | 156 | "kernelspec": {
|
152 |
| - "display_name": "Python 3", |
| 157 | + "display_name": "Python 3 (ipykernel)", |
153 | 158 | "language": "python",
|
154 | 159 | "name": "python3"
|
155 | 160 | },
|
156 | 161 | "language_info": {
|
157 | 162 | "codemirror_mode": {
|
158 | 163 | "name": "ipython",
|
159 |
| - "version": 2 |
| 164 | + "version": 3 |
160 | 165 | },
|
161 | 166 | "file_extension": ".py",
|
162 | 167 | "mimetype": "text/x-python",
|
163 | 168 | "name": "python",
|
164 | 169 | "nbconvert_exporter": "python",
|
165 |
| - "pygments_lexer": "ipython2", |
166 |
| - "version": "2.7.6" |
| 170 | + "pygments_lexer": "ipython3", |
| 171 | + "version": "3.11.10" |
167 | 172 | }
|
168 | 173 | },
|
169 | 174 | "nbformat": 4,
|
170 |
| - "nbformat_minor": 0 |
| 175 | + "nbformat_minor": 4 |
171 | 176 | }
|