Commit 3787771 ("Initialized github")

Parent: 4d55415

File tree: 7 files changed (+22, −22 lines)


NAMESPACE

Lines changed: 2 additions & 1 deletion

```diff
@@ -6,7 +6,8 @@ exportPattern(s2net,
   s2Fista,
   s2netR,
   simulate_extra,
-  simulate_groups)
+  simulate_groups,
+  transform_ExtJT)

 S3method(print, s2Data)
 S3method(print, s2Fista)
```
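This hunk adds `transform_ExtJT` to the package's exported symbols. A generic way to confirm that an export actually landed in an installed package (a sketch; the `is_exported` helper below is ours, not part of `s2net`):

```r
# Hypothetical helper: check whether an installed package exports a symbol.
is_exported <- function(pkg, symbol) {
  symbol %in% getNamespaceExports(pkg)
}

# After installing this commit, is_exported("s2net", "transform_ExtJT")
# should return TRUE. The same check works against any installed package:
is_exported("stats", "lm")        # TRUE: a real export
is_exported("stats", "not_here")  # FALSE: not exported
```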

README.Rmd

Lines changed: 1 addition & 1 deletion

```diff
@@ -19,7 +19,7 @@ knitr::opts_chunk$set(

 R package `s2net`

-+ Our method extends the supervised elastic-net problem, and thus it is an ideal solution to the problem of feature selection in semi-supervised contexts.
++ Our method extends the supervised elastic-net problem, and thus it is a practical solution to the problem of feature selection in semi-supervised contexts.
 + Its mathematical formulation is presented from a general perspective, covering a wide range of models.
 + We develop a flexible and fast implementation for `s2net` in `R`, written in `C++` using `RcppArmadillo` and integrated into `R` via `Rcpp` modules.
```

README.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@
 R package `s2net`

   - Our method extends the supervised elastic-net problem, and thus it
-    is an ideal solution to the problem of feature selection in
+    is a practical solution to the problem of feature selection in
     semi-supervised contexts.
   - Its mathematical formulation is presented from a general
     perspective, covering a wide range of models.
```
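Both READMEs describe `s2net` as extending the supervised elastic-net problem. For reference, the standard supervised elastic-net estimate being extended is (scaling constants vary between implementations; the names match the `lambda1`/`lambda2` arguments of `s2Params` used in the vignette):

$$
\hat{\beta} = \operatorname*{arg\,min}_{\beta} \; \frac{1}{2n} \lVert y - X\beta \rVert_2^2 + \lambda_1 \lVert \beta \rVert_1 + \lambda_2 \lVert \beta \rVert_2^2
$$

Setting $\lambda_1 = \lambda_2 = 0$ recovers ordinary least squares, and $\lambda_2 = 0$ gives the lasso, which is how the vignette compares `s2net` against `lm` and `glmnet`.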

man/figures/s2net.png

-4.75 KB

src/transformations.h

Lines changed: 0 additions & 1 deletion

```diff
@@ -6,7 +6,6 @@
 #define TYPE_TRANSFORM_ExtJT 1


-
 arma::mat transform_ExtJT(const arma::mat & X, double gamma2, double gamma3){
   int n = X.n_rows;
   int p = X.n_cols;
```

tests/testthat/test.optimization.R

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,7 +7,7 @@ lm_test = function(train){
   lm_fit = lm.fit(x = train$xL, y = train$yL)
   true_beta = unname(lm_fit$coefficients)
   found_beta = as.vector(obj$beta)
-  expect_equal(found_beta, true_beta, tolerance = .001)
+  expect_equal(found_beta, true_beta, tolerance = .01)

   lm_error = mean((train$xL%*%lm_fit$coefficients - train$yL)^2)
   error = mean((obj$intercept + train$xL%*%obj$beta - train$yL)^2)
```
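The only change here loosens the test tolerance from `.001` to `.01`. `expect_equal` compares via `all.equal` (testthat's 3rd edition uses waldo, which treats `tolerance` similarly), where `tolerance` bounds the *mean relative difference* between the vectors. A base-R sketch of what the loosening buys (the numbers are made up for illustration):

```r
# all.equal() underlies testthat::expect_equal(); "tolerance" bounds the
# mean relative difference between the two numeric vectors.
true_beta  <- c(1.000, -2.000)
found_beta <- c(1.004, -2.008)   # mean relative difference = 0.004

isTRUE(all.equal(found_beta, true_beta, tolerance = .001))  # FALSE: too strict
isTRUE(all.equal(found_beta, true_beta, tolerance = .01))   # TRUE: within .01
```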

vignettes/supervised.Rmd

Lines changed: 17 additions & 17 deletions

````diff
@@ -1,8 +1,8 @@
 ---
-title: "The supervised `nExtJT`"
+title: "The supervised `s2net`"
 output: rmarkdown::html_vignette
 vignette: >
-  %\VignetteIndexEntry{The supervised `nExtJT`}
+  %\VignetteIndexEntry{The supervised `s2net`}
   %\VignetteEngine{knitr::rmarkdown}
   %\VignetteEncoding{UTF-8}
 ---
@@ -17,18 +17,18 @@ knitr::opts_chunk$set(
 )
 ```

-`nExtJT` can be used as a supervised method (without unlabeled data) and it is equialent to elastic net.
+`s2net` can be used as a supervised method (without unlabeled data) and it is equialent to elastic net.

 ## Data

-The `auto_mpg` dataset is available when `nExtJT` is installed.
+The `auto_mpg` dataset is available when `s2net` is installed.

 ```{r}
-library(nExtJT)
+library(s2net)
 data("auto_mpg")

-# Preprocess the data using the nExtData function
-train = nExtData(auto_mpg$P1$xL, auto_mpg$P1$yL, preprocess = TRUE)
+# Preprocess the data using the s2Data function
+train = s2Data(auto_mpg$P1$xL, auto_mpg$P1$yL, preprocess = TRUE)
 ```

 ## Ordinary least squares
@@ -39,10 +39,10 @@ To fit an OLS model, we will use the `lm` function (without intercept).
 lm.fit = lm( y~ 0 + ., data = data.frame(train$xL, y = train$yL))
 ```

-To obtain the estimations from `nExtJT` we use
+To obtain the estimations from `s2net` we use

 ```{r}
-obj = nExtJTR(train, nExtParams(0))
+obj = s2netR(train, s2Params(0))
 # We set all the hyper-parameters to 0
 ```

@@ -56,11 +56,11 @@ print("OLS error:")
 mse(ypred, train$yL)

 ypred = predict(obj, train$xL)
-print("nExtJT error:")
+print("s2net error:")
 mse(ypred, train$yL)

 #Estimations
-data.frame(mle = lm.fit$coefficients, nExtJT = obj$beta)
+data.frame(mle = lm.fit$coefficients, s2net = obj$beta)
 ```

 ## Lasso
@@ -74,13 +74,13 @@ ypred = predict(lasso.fit, train$xL)
 print("Lasso error:")
 mse(ypred, train$yL)

-obj = nExtJTR(train, nExtParams(lambda1 = 0.01))
+obj = s2netR(train, s2Params(lambda1 = 0.01))
 ypred = predict(obj, train$xL)
-print("nExtJT error")
+print("s2net error")
 mse(ypred, train$yL)

 print("Coefficients")
-data.frame(lasso = as.numeric(lasso.fit$beta), nExtJT = obj$beta)
+data.frame(lasso = as.numeric(lasso.fit$beta), s2net = obj$beta)
 ```

 ## Elastic net
@@ -92,11 +92,11 @@ ypred = predict(enet.fit, train$xL)
 print("glmnet error")
 mse(ypred, train$yL)

-obj = nExtJTR(train, nExtParams(lambda1 = 0.01, lambda2 = 0.01))
+obj = s2netR(train, s2Params(lambda1 = 0.01, lambda2 = 0.01))
 ypred = predict(obj, train$xL)
-print("nExtJT error")
+print("s2net error")
 mse(ypred, train$yL)

 print("Coefficients")
-data.frame(enet = as.matrix(enet.fit$beta), nExtJT = obj$beta)
+data.frame(enet = as.matrix(enet.fit$beta), s2net = obj$beta)
 ```
````
