
Commit d490dee

feat: add suffixes to models to indicate their task (#588)
### Summary of Changes

Add the suffix `Classifier` to all models for classification and `Regressor` to all models for regression. While longer, this naming has several advantages:

* Better **readability**: Several models have variants for classification and regression. Previously, both had the same name, so the imports had to be checked to see which one was used in the code.
* Better **auto-completion**: Users can now simply type `Classifier` or `Regressor` to get a list of all suitable models.
* Better **understandability**: It is now obvious that logistic regression is used for classification.

---------

Co-authored-by: megalinter-bot <129584137+megalinter-bot@users.noreply.github.com>
1 parent ea176fc commit d490dee
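A minimal before/after sketch of the rename, based on the changes in this diff; `tagged_train_table` is assumed to already exist, as in the tutorials:

```python
# Before this commit, the classification and regression variants shared a name,
# so only the import path revealed which task a model solved:
# from safeds.ml.classical.classification import RandomForest
# model = RandomForest()

# After this commit, the suffix states the task at the call site:
from safeds.ml.classical.classification import RandomForestClassifier

model = RandomForestClassifier()
fitted_model = model.fit(tagged_train_table)  # tagged_train_table: a pre-built TaggedTable (assumed to exist)
```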


42 files changed (+370, -316 lines)

docs/glossary.md

Lines changed: 3 additions & 3 deletions
@@ -25,7 +25,7 @@ It classifies the predictions to be either be [true positive](#true-positive-tp)
 ## Decision Tree
 A Decision Tree represents the process of conditional evaluation in a tree diagram.
 
-Implemented in Safe-DS as [Decision Tree][safeds.ml.classical.classification.DecisionTree].
+Implemented in Safe-DS as [DecisionTreeClassifier][safeds.ml.classical.classification.DecisionTreeClassifier] and [DecisionTreeRegressor][safeds.ml.classical.regression.DecisionTreeRegressor].
 
 ## F1-Score
 The harmonic mean of [precision](#precision) and [recall](#recall). Formula:
@@ -48,7 +48,7 @@ It is analogous to a column within a table.
 Linear Regression is the supervised Machine Learning model in which the model finds the best fit linear line between the independent and dependent variable
 i.e. it finds the linear relationship between the dependent and independent variable.
 
-Implemented in Safe-DS as [LinearRegression][safeds.ml.classical.regression.LinearRegression].
+Implemented in Safe-DS as [LinearRegression][safeds.ml.classical.regression.LinearRegressionRegressor].
 
 ## Machine Learning (ML)
 Machine Learning is a generic term for artificially generating knowledge through experience.
@@ -84,7 +84,7 @@ See here for respective references:
 ## Random Forest
 Random Forest is an ML model that works by generating decision trees at random.
 
-Implemented in Safe-DS as [RandomForest][safeds.ml.classical.regression.RandomForest].
+Implemented in Safe-DS as [RandomForestClassifier][safeds.ml.classical.classification.RandomForestClassifier] and [RandomForestRegressor][safeds.ml.classical.regression.RandomForestRegressor].
 
 ## Recall
 The ability of a [classification](#classification) model to identify all the relevant data points. Formula:

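The updated glossary entries reference separate classification and regression variants of the same algorithm family. A small sketch of how the two variants sit side by side after this change; the two tagged tables are hypothetical placeholders:

```python
from safeds.ml.classical.classification import DecisionTreeClassifier
from safeds.ml.classical.regression import DecisionTreeRegressor

# Same algorithm family, but the suffix now names the task directly.
fitted_classifier = DecisionTreeClassifier().fit(tagged_classification_table)  # hypothetical TaggedTable
fitted_regressor = DecisionTreeRegressor().fit(tagged_regression_table)        # hypothetical TaggedTable
```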
docs/tutorials/classification.ipynb

Lines changed: 2 additions & 2 deletions
@@ -145,9 +145,9 @@
    "execution_count": null,
    "outputs": [],
    "source": [
-    "from safeds.ml.classical.classification import RandomForest\n",
+    "from safeds.ml.classical.classification import RandomForestClassifier\n",
     "\n",
-    "model = RandomForest()\n",
+    "model = RandomForestClassifier()\n",
     "fitted_model= model.fit(tagged_train_table)"
    ],
    "metadata": {

docs/tutorials/machine_learning.ipynb

Lines changed: 2 additions & 2 deletions
@@ -54,9 +54,9 @@
    "execution_count": null,
    "outputs": [],
    "source": [
-    "from safeds.ml.classical.regression import LinearRegression\n",
+    "from safeds.ml.classical.regression import LinearRegressionRegressor\n",
     "\n",
-    "model = LinearRegression()\n",
+    "model = LinearRegressionRegressor()\n",
     "fitted_model = model.fit(tagged_table)"
    ],
    "metadata": {

docs/tutorials/regression.ipynb

Lines changed: 2 additions & 2 deletions
@@ -98,9 +98,9 @@
    "execution_count": null,
    "outputs": [],
    "source": [
-    "from safeds.ml.classical.regression import DecisionTree\n",
+    "from safeds.ml.classical.regression import DecisionTreeRegressor\n",
     "\n",
-    "model = DecisionTree()\n",
+    "model = DecisionTreeRegressor()\n",
     "fitted_model = model.fit(tagged_train_table)"
    ],
    "metadata": {
src/safeds/ml/classical/classification/__init__.py

Lines changed: 14 additions & 14 deletions
@@ -1,21 +1,21 @@
 """Classes for classification tasks."""
 
-from ._ada_boost import AdaBoost
+from ._ada_boost import AdaBoostClassifier
 from ._classifier import Classifier
-from ._decision_tree import DecisionTree
-from ._gradient_boosting import GradientBoosting
-from ._k_nearest_neighbors import KNearestNeighbors
-from ._logistic_regression import LogisticRegression
-from ._random_forest import RandomForest
-from ._support_vector_machine import SupportVectorMachine
+from ._decision_tree import DecisionTreeClassifier
+from ._gradient_boosting import GradientBoostingClassifier
+from ._k_nearest_neighbors import KNearestNeighborsClassifier
+from ._logistic_regression import LogisticRegressionClassifier
+from ._random_forest import RandomForestClassifier
+from ._support_vector_machine import SupportVectorMachineClassifier
 
 __all__ = [
-    "AdaBoost",
+    "AdaBoostClassifier",
     "Classifier",
-    "DecisionTree",
-    "GradientBoosting",
-    "KNearestNeighbors",
-    "LogisticRegression",
-    "RandomForest",
-    "SupportVectorMachine",
+    "DecisionTreeClassifier",
+    "GradientBoostingClassifier",
+    "KNearestNeighborsClassifier",
+    "LogisticRegressionClassifier",
+    "RandomForestClassifier",
+    "SupportVectorMachineClassifier",
 ]
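Since the renamed classes are all exported alongside the shared `Classifier` interface, the suffixed models remain interchangeable wherever a `Classifier` is expected. A minimal sketch under that assumption, again treating `tagged_train_table` as a pre-existing TaggedTable:

```python
from safeds.ml.classical.classification import (
    Classifier,
    KNearestNeighborsClassifier,
    LogisticRegressionClassifier,
)

# Any suffixed model can stand in where a Classifier is expected.
models: list[Classifier] = [
    KNearestNeighborsClassifier(2),  # constructor argument passed positionally, as in this diff
    LogisticRegressionClassifier(),
]
fitted_models = [model.fit(tagged_train_table) for model in models]  # tagged_train_table: assumed TaggedTable
```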

src/safeds/ml/classical/classification/_ada_boost.py

Lines changed: 4 additions & 4 deletions
@@ -15,7 +15,7 @@
 from safeds.data.tabular.containers import Table, TaggedTable
 
 
-class AdaBoost(Classifier):
+class AdaBoostClassifier(Classifier):
     """
     Ada Boost classification.
 
@@ -99,7 +99,7 @@ def learning_rate(self) -> float:
         """
         return self._learning_rate
 
-    def fit(self, training_set: TaggedTable) -> AdaBoost:
+    def fit(self, training_set: TaggedTable) -> AdaBoostClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -112,7 +112,7 @@ def fit(self, training_set: TaggedTable) -> AdaBoost:
 
         Returns
         -------
-        fitted_classifier : AdaBoost
+        fitted_classifier : AdaBoostClassifier
             The fitted classifier.
 
         Raises
@@ -131,7 +131,7 @@ def fit(self, training_set: TaggedTable) -> AdaBoost:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = AdaBoost(
+        result = AdaBoostClassifier(
            learner=self.learner,
            maximum_number_of_learners=self.maximum_number_of_learners,
            learning_rate=self._learning_rate,

src/safeds/ml/classical/classification/_decision_tree.py

Lines changed: 4 additions & 4 deletions
@@ -14,7 +14,7 @@
 from safeds.data.tabular.containers import Table, TaggedTable
 
 
-class DecisionTree(Classifier):
+class DecisionTreeClassifier(Classifier):
     """Decision tree classification."""
 
     def __init__(self) -> None:
@@ -23,7 +23,7 @@ def __init__(self) -> None:
         self._feature_names: list[str] | None = None
         self._target_name: str | None = None
 
-    def fit(self, training_set: TaggedTable) -> DecisionTree:
+    def fit(self, training_set: TaggedTable) -> DecisionTreeClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -36,7 +36,7 @@ def fit(self, training_set: TaggedTable) -> DecisionTree:
 
         Returns
         -------
-        fitted_classifier : DecisionTree
+        fitted_classifier : DecisionTreeClassifier
             The fitted classifier.
 
         Raises
@@ -55,7 +55,7 @@ def fit(self, training_set: TaggedTable) -> DecisionTree:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = DecisionTree()
+        result = DecisionTreeClassifier()
         result._wrapped_classifier = wrapped_classifier
         result._feature_names = training_set.features.column_names
         result._target_name = training_set.target.name

src/safeds/ml/classical/classification/_gradient_boosting.py

Lines changed: 4 additions & 4 deletions
@@ -15,7 +15,7 @@
 from safeds.data.tabular.containers import Table, TaggedTable
 
 
-class GradientBoosting(Classifier):
+class GradientBoostingClassifier(Classifier):
     """
     Gradient boosting classification.
 
@@ -74,7 +74,7 @@ def learning_rate(self) -> float:
         """
         return self._learning_rate
 
-    def fit(self, training_set: TaggedTable) -> GradientBoosting:
+    def fit(self, training_set: TaggedTable) -> GradientBoostingClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -87,7 +87,7 @@ def fit(self, training_set: TaggedTable) -> GradientBoosting:
 
         Returns
         -------
-        fitted_classifier : GradientBoosting
+        fitted_classifier : GradientBoostingClassifier
             The fitted classifier.
 
         Raises
@@ -106,7 +106,7 @@ def fit(self, training_set: TaggedTable) -> GradientBoosting:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = GradientBoosting(number_of_trees=self._number_of_trees, learning_rate=self._learning_rate)
+        result = GradientBoostingClassifier(number_of_trees=self._number_of_trees, learning_rate=self._learning_rate)
         result._wrapped_classifier = wrapped_classifier
         result._feature_names = training_set.features.column_names
         result._target_name = training_set.target.name

src/safeds/ml/classical/classification/_k_nearest_neighbors.py

Lines changed: 4 additions & 4 deletions
@@ -15,7 +15,7 @@
 from safeds.data.tabular.containers import Table, TaggedTable
 
 
-class KNearestNeighbors(Classifier):
+class KNearestNeighborsClassifier(Classifier):
     """
     K-nearest-neighbors classification.
 
@@ -56,7 +56,7 @@ def number_of_neighbors(self) -> int:
         """
         return self._number_of_neighbors
 
-    def fit(self, training_set: TaggedTable) -> KNearestNeighbors:
+    def fit(self, training_set: TaggedTable) -> KNearestNeighborsClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -69,7 +69,7 @@ def fit(self, training_set: TaggedTable) -> KNearestNeighbors:
 
         Returns
         -------
-        fitted_classifier : KNearestNeighbors
+        fitted_classifier : KNearestNeighborsClassifier
             The fitted classifier.
 
         Raises
@@ -99,7 +99,7 @@ def fit(self, training_set: TaggedTable) -> KNearestNeighbors:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = KNearestNeighbors(self._number_of_neighbors)
+        result = KNearestNeighborsClassifier(self._number_of_neighbors)
         result._wrapped_classifier = wrapped_classifier
         result._feature_names = training_set.features.column_names
         result._target_name = training_set.target.name

src/safeds/ml/classical/classification/_logistic_regression.py

Lines changed: 4 additions & 4 deletions
@@ -14,7 +14,7 @@
 from safeds.data.tabular.containers import Table, TaggedTable
 
 
-class LogisticRegression(Classifier):
+class LogisticRegressionClassifier(Classifier):
     """Regularized logistic regression."""
 
     def __init__(self) -> None:
@@ -23,7 +23,7 @@ def __init__(self) -> None:
         self._feature_names: list[str] | None = None
         self._target_name: str | None = None
 
-    def fit(self, training_set: TaggedTable) -> LogisticRegression:
+    def fit(self, training_set: TaggedTable) -> LogisticRegressionClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -36,7 +36,7 @@ def fit(self, training_set: TaggedTable) -> LogisticRegression:
 
         Returns
         -------
-        fitted_classifier : LogisticRegression
+        fitted_classifier : LogisticRegressionClassifier
             The fitted classifier.
 
         Raises
@@ -55,7 +55,7 @@ def fit(self, training_set: TaggedTable) -> LogisticRegression:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = LogisticRegression()
+        result = LogisticRegressionClassifier()
         result._wrapped_classifier = wrapped_classifier
         result._feature_names = training_set.features.column_names
         result._target_name = training_set.target.name

src/safeds/ml/classical/classification/_random_forest.py

Lines changed: 4 additions & 4 deletions
@@ -15,7 +15,7 @@
 from safeds.data.tabular.containers import Table, TaggedTable
 
 
-class RandomForest(Classifier):
+class RandomForestClassifier(Classifier):
     """Random forest classification.
 
     Parameters
@@ -54,7 +54,7 @@ def number_of_trees(self) -> int:
         """
         return self._number_of_trees
 
-    def fit(self, training_set: TaggedTable) -> RandomForest:
+    def fit(self, training_set: TaggedTable) -> RandomForestClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -67,7 +67,7 @@ def fit(self, training_set: TaggedTable) -> RandomForest:
 
         Returns
         -------
-        fitted_classifier : RandomForest
+        fitted_classifier : RandomForestClassifier
             The fitted classifier.
 
         Raises
@@ -86,7 +86,7 @@ def fit(self, training_set: TaggedTable) -> RandomForest:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = RandomForest(number_of_trees=self._number_of_trees)
+        result = RandomForestClassifier(number_of_trees=self._number_of_trees)
         result._wrapped_classifier = wrapped_classifier
         result._feature_names = training_set.features.column_names
         result._target_name = training_set.target.name

src/safeds/ml/classical/classification/_support_vector_machine.py

Lines changed: 8 additions & 8 deletions
@@ -30,7 +30,7 @@ def _get_sklearn_kernel(self) -> object:
         """
 
 
-class SupportVectorMachine(Classifier):
+class SupportVectorMachineClassifier(Classifier):
     """
     Support vector machine.
 
@@ -151,18 +151,18 @@ def _get_kernel_name(self) -> str:
         TypeError
             If the kernel type is invalid.
         """
-        if isinstance(self.kernel, SupportVectorMachine.Kernel.Linear):
+        if isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.Linear):
             return "linear"
-        elif isinstance(self.kernel, SupportVectorMachine.Kernel.Polynomial):
+        elif isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.Polynomial):
             return "poly"
-        elif isinstance(self.kernel, SupportVectorMachine.Kernel.Sigmoid):
+        elif isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.Sigmoid):
             return "sigmoid"
-        elif isinstance(self.kernel, SupportVectorMachine.Kernel.RadialBasisFunction):
+        elif isinstance(self.kernel, SupportVectorMachineClassifier.Kernel.RadialBasisFunction):
             return "rbf"
         else:
             raise TypeError("Invalid kernel type.")
 
-    def fit(self, training_set: TaggedTable) -> SupportVectorMachine:
+    def fit(self, training_set: TaggedTable) -> SupportVectorMachineClassifier:
         """
         Create a copy of this classifier and fit it with the given training data.
 
@@ -175,7 +175,7 @@ def fit(self, training_set: TaggedTable) -> SupportVectorMachine:
 
         Returns
         -------
-        fitted_classifier : SupportVectorMachine
+        fitted_classifier : SupportVectorMachineClassifier
             The fitted classifier.
 
         Raises
@@ -194,7 +194,7 @@ def fit(self, training_set: TaggedTable) -> SupportVectorMachine:
         wrapped_classifier = self._get_sklearn_classifier()
         fit(wrapped_classifier, training_set)
 
-        result = SupportVectorMachine(c=self._c, kernel=self._kernel)
+        result = SupportVectorMachineClassifier(c=self._c, kernel=self._kernel)
         result._wrapped_classifier = wrapped_classifier
         result._feature_names = training_set.features.column_names
         result._target_name = training_set.target.name
