Commit 3e3d085 (1 parent ee63b75): 14 changed files with 324 additions and 125 deletions.
```r
#' Some surprising facts about bananas
#'
#' Prints a randomly chosen banana fact to the console.
#'
#' @export
banana.facts <- function() {
  all.facts <- list(
    "Bananas can be found in other colors, too, including red.",
    "Banana plants are not trees.",
    "The scientific name for banana is Musa sapientum, which means 'fruit of the wise men'.",
    "Bananas float in water. Go try it!",
    "Bananas are berries.",
    "Humans share about 50% of our DNA with bananas.",
    "A banana contains naturally occurring radioactive material in the form of potassium-40.",
    "Banana equivalent dose (BED) is an informal measurement of ionizing radiation exposure."
  )
  cat(crayon::bgYellow("Banana fact:"), sample(all.facts, size = 1)[[1]])
}
```
```r
#' Plot network output, targets, and input over time
#'
#' @param network a network object
#' @param seq1 a Sequence object
#' @param freqs the output channels to plot, one panel column per entry
#' @import ggplot2
#' @importFrom gridExtra grid.arrange arrangeGrob
#' @export
plotPredictions <- function(network, seq1, freqs) {

  Network_activate(network, seq1)
  ln <- Sequence_size(seq1)
  vals <- matrix(NA, nrow = ln, ncol = length(freqs))
  trt <- matrix(NA, nrow = ln, ncol = length(freqs))
  inp <- rep(NA, ln)
  for (i in 1:ln) {
    outp <- Network_get_output(network, i - 1)
    vals[i, ] <- .Call('R_swig_toValue', outp, package = "bnnlib")
    outp <- Sequence_get_target(seq1, i - 1)
    trt[i, ] <- .Call('R_swig_toValue', outp, package = "bnnlib")
    inpp <- Sequence_get_input(seq1, i - 1)
    inp[i] <- .Call('R_swig_toValue', inpp, package = "bnnlib")
  }

  # network output (predictions) over time
  melted <- reshape2::melt(vals)
  melted$Var2 <- factor(melted$Var2)
  p1 <- ggplot(data = melted, aes(x = Var1, y = value, group = Var2, color = Var2)) +
    geom_point() + geom_line() + theme_minimal() +
    theme(legend.position = "bottom") + ylab("Value") + xlab("Time") +
    ggtitle("Prediction")

  # target values over time
  melted <- reshape2::melt(trt)
  melted$Var2 <- factor(melted$Var2)
  p2 <- ggplot(data = melted, aes(x = Var1, y = value, group = Var2, color = Var2)) +
    geom_point() + geom_line() + theme_minimal() +
    theme(legend.position = "bottom") + ylab("Value") + xlab("Time") +
    ggtitle("True Value")

  # input time series
  p3 <- ggplot(data = data.frame(x = 1:length(inp), y = inp), aes(x = x, y = y)) +
    geom_line() + theme_minimal() + ylab("Value") + xlab("Time") +
    ggtitle("Time Series")

  # extract a single shared legend from the first panel
  g_legend <- function(a.gplot) {
    tmp <- ggplot_gtable(ggplot_build(a.gplot))
    leg <- which(sapply(tmp$grobs, function(x) x$name) == "guide-box")
    tmp$grobs[[leg]]
  }
  mylegend <- g_legend(p1)

  pl <- grid.arrange(arrangeGrob(p1 + theme(legend.position = "none"),
                                 p2 + theme(legend.position = "none"), p3,
                                 nrow = 3),
                     mylegend, nrow = 2, heights = c(10, 1))

  invisible(pl)
}
```
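The reshaping step inside `plotPredictions` can be illustrated on synthetic data. This sketch reproduces, in base R, the long format that `reshape2::melt` produces for a matrix (the column names `Var1`, `Var2`, and `value` are melt's defaults):

```r
# A 3x2 matrix of fake activations: 3 time steps, 2 output channels.
vals <- matrix(c(0.1, 0.2, 0.3, 0.9, 0.8, 0.7), nrow = 3, ncol = 2)

# Long format, one row per (time step, channel) pair.
melted <- data.frame(
  Var1  = rep(seq_len(nrow(vals)), times = ncol(vals)),         # time index
  Var2  = factor(rep(seq_len(ncol(vals)), each = nrow(vals))),  # channel
  value = as.vector(vals)
)
nrow(melted)  # 6 rows: 3 time steps x 2 channels
```

Each channel then becomes one colored line in the ggplot calls above, grouped by `Var2`.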
```r
#
# complementary SWIG stuff
#

# add missing vector-of-vector pointer class
setClass("_p_std__vectorT_std__vectorT_double_std__allocatorT_double_t_t_p_std__allocatorT_std__vectorT_double_std__allocatorT_double_t_t_p_t_t",
         contains = "ExternalReference")

setClass("_p_FeedforwardEnsemble", contains = c("ExternalReference", "_p_Ensemble"))
setClass("_p_LSTMEnsemble", contains = c("ExternalReference", "_p_Ensemble"))

setClass("_p_BackpropTrainer", contains = c("ExternalReference", "_p_Trainer"))
setClass("_p_ImprovedRPropTrainer", contains = c("ExternalReference", "_p_Trainer"))
setClass("_p_RPropTrainer", contains = c("ExternalReference", "_p_Trainer"))
setClass("_p_ARPropTrainer", contains = c("ExternalReference", "_p_Trainer"))
```
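The pattern behind these registrations can be shown with a small self-contained example using hypothetical class names: an S4 class declared with `contains =` participates in method dispatch for its parent, which is what lets a SWIG-returned pointer such as `_p_BackpropTrainer` be passed wherever a `_p_Trainer` is expected.

```r
library(methods)

# Hypothetical stand-ins for the SWIG pointer classes above.
setClass("Trainer_demo", representation(ptr = "numeric"))
setClass("BackpropTrainer_demo", contains = "Trainer_demo")

setGeneric("describe", function(obj) standardGeneric("describe"))
setMethod("describe", "Trainer_demo", function(obj) "a trainer")

# The subclass inherits the parent's method.
describe(new("BackpropTrainer_demo", ptr = 1))  # "a trainer"
```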
---
title : "bnn: Recurrent Neural Networks with R"
shorttitle : "Title"

author:
  - name : "First Author"
    affiliation : "1"
    corresponding : yes # Define only one corresponding author
    address : "Postal address"
    email : "my@email.com"
  - name : "Ernst-August Doelle"
    affiliation : "1,2"

affiliation:
  - id : "1"
    institution : "Wilhelm-Wundt-University"
  - id : "2"
    institution : "Konstanz Business School"

authornote: |
  Add complete departmental affiliations for each author here. Each new line herein must be indented, like this line.
  Enter author note here.

abstract: |
  One or two sentences providing a **basic introduction** to the field, comprehensible to a scientist in any discipline.
  Two to three sentences of **more detailed background**, comprehensible to scientists in related disciplines.
  One sentence clearly stating the **general problem** being addressed by this particular study.
  One sentence summarizing the main result (with the words "**here we show**" or their equivalent).
  Two or three sentences explaining what the **main result** reveals in direct comparison to what was thought to be the case previously, or how the main result adds to previous knowledge.
  One or two sentences to put the results into a more **general context**.
  Two or three sentences to provide a **broader perspective**, readily comprehensible to a scientist in any discipline.
  <!-- https://tinyurl.com/ybremelq -->

keywords : "keywords"
wordcount : "X"

bibliography : ["r-references.bib"]

floatsintext : no
figurelist : no
tablelist : no
footnotelist : no
linenumbers : yes
mask : no
draft : no

documentclass : "apa6"
classoption : "man"
output : papaja::apa6_pdf
---

```{r setup, include = FALSE}
library("papaja")
library(bnn)
```

```{r analysis-preferences}
# Seed for random number generation
set.seed(42)
knitr::opts_chunk$set(cache.extra = knitr::rand_seed)
```

# Introduction

Neural networks are biologically inspired, general-purpose computational methods that serve various purposes in psychological research. First, neural networks can be used as black-box models for non-linear regression and classification problems. `bnn` is written in C++ to enable fast computation across all platforms. The R package contains wrappers that were created using SWIG and additional manual wrapper design.

Neural networks

# Methods

First, we install the library from GitHub and attach the package to the current workspace.

```{r eval=FALSE}
devtools::install_github("brandmaier/bnn")
library(bnn)
```

There are seven essential elements in `bnn`: nodes, ensembles, networks, trainers, sequences, sequence sets, and error functions. For example, a ready-made LSTM network is created with its constructor:

```{r}
LSTMNetwork()
```
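As a minimal sketch of how these elements appear together in code, the following collects the constructors that are used elsewhere in this document (the construction of sequences and sequence sets is not shown here, and the chunk is not evaluated):

```{r eval=FALSE}
network <- LSTMNetwork()                       # a ready-made recurrent network
layer   <- FeedforwardEnsemble(TANH_NODE, 10)  # an ensemble of ten tanh nodes
trainer <- BackpropTrainer()                   # a training algorithm
errfun  <- SquaredErrorFunction()              # an error function
```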

`bnn` provides a factory class\footnote{Factories are classes that are never instantiated but only serve to create objects.} called `NetworkFactory` that provides methods to generate a variety of standard architectures.

```{r}
```
### Creating customized networks

Customized network architectures can be created by arranging ensembles into a network. Ensembles are sets of nodes (often referred to as layers), even though ensembles can also abstract complex cell types that are created from sets of cells (which, in turn, could themselves be ensembles).

```{r}
input_layer <- FeedforwardEnsemble(TANH_NODE, 2)
hidden_layer <- FeedforwardEnsemble(TANH_NODE, 10)
output_layer <- FeedforwardEnsemble(LINEAR_NODE, 1)
```

The layers can be arranged into a network by using the concatenation operator provided by `bnn`:

```{r}
network <- Network() %>% input_layer %>% hidden_layer %>% output_layer
```

TODO: Washout_time (Network)
## Trainer

In `bnnlib`, various algorithms for fitting neural networks are available, including backpropagation, stochastic gradient descent, adaptive moment estimation (ADAM), resilient backpropagation (RProp), improved resilient propagation (IRProp), and root-mean-square propagation (RMSProp). A training algorithm is instantiated by attaching it to an existing network.

```{r}
trainer <- BackpropTrainer()
network %>% trainer
```

There are SWIG wrapper functions to change the default behavior of the trainers. For those using a learning rate, it can be set via

```{r}
Trainer_learning_rate_set(trainer, 0.0001)
```

For those using momentum, it can be set as

```{r}
Trainer_momentum_set(trainer, 0.01)
```

To switch between batch learning and stochastic gradient descent, one can set the batch-learning option. If it is `TRUE`, the gradients for all sequences are computed and then a single weight change is performed. If `FALSE`, the weights are updated after each sequence.

```{r}
Trainer_batch_learning_set(trainer, TRUE)
```
Furthermore, a `Trainer` can have several callbacks. Callbacks are methods that are called after a given number of epochs and perform a specified action, such as saving the network or printing informative output.

Last, a `Trainer` can also have one or more stopping criteria. By default, no stopping criterion is given and training always proceeds until the specified number of epochs has been trained. This may result in over-fitting, and it is common practice to implement some form of early stopping rule, that is, to test whether training should be aborted prematurely (for example, because the error on a validation set starts to increase). Stopping rules include the `ConvergenceCriterion`, which takes a single number as argument: if the absolute difference of the error function between two epochs is equal to or smaller than that number, training is stopped. Here are some examples of how to implement callbacks and stopping rules:

```{r}
Trainer_add_callback(trainer, ProgressDot())
Trainer_add_abort_criterion(trainer, ConvergenceCriterion(0.00001))
```
## Error Functions

Error functions determine the error of the output nodes in a neural network as a divergence function of their activations and the target values. By default, the squared error loss is used, which is appropriate for regression problems with continuous output variables (typically associated with a linear activation function in the output layer); it is instantiated using the constructor `SquaredErrorFunction()`. Other error functions include the `MinkowskiErrorFunction()`, which implements an absolute error function (also known as the Manhattan error function). For classification, the appropriate error functions are `CrossEntropyErrorFunction()`, which implements the cross-entropy error appropriate for 0-1-coded classification tasks, and `WinnerTakesAllErrorFunction()`, which is appropriate for tasks with one-hot coding, where the output layer represents a discrete probability distribution.
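As a point of reference, the two regression losses can be written out in plain R. This illustrates the definitions only and is not `bnnlib` code:

```{r}
squared_error   <- function(y, t) sum((y - t)^2)   # default loss
minkowski_error <- function(y, t) sum(abs(y - t))  # absolute/Manhattan loss

y <- c(0.9, 0.1)  # output activations
t <- c(1, 0)      # target values
squared_error(y, t)    # 0.02
minkowski_error(y, t)  # 0.2
```

The squared error penalizes large deviations disproportionately, while the absolute error grows linearly in the deviation.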
## Data formatting

`bnnlib` is
## Plotting facilities

The package has a particular focus on plotting network activations over time.
## Demonstrations

### Frequencies Task

### Grammar Task

### Comparing Training Algorithms

# Discussion

Do we really need yet another neural networks library?
\newpage

# References

```{r create_r-references}
r_refs(file = "r-references.bib")
```

\begingroup
\setlength{\parindent}{-0.5in}
\setlength{\leftskip}{0.5in}

<div id = "refs"></div>
\endgroup