diff --git a/index.Rmd b/index.Rmd
index 2d83defbd..f77c1042b 100644
--- a/index.Rmd
+++ b/index.Rmd
@@ -214,7 +214,7 @@ Keras provides a vocabulary for building deep learning models that is simple, el
 To learn the basics of Keras, we recommend the following sequence of tutorials:
 
-- [Basic Classification](articles/tutorial_basic_classfication.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+- [Basic Classification](articles/tutorial_basic_classification.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
 
 - [Text Classification](articles/tutorial_basic_text_classification.html) --- This tutorial classifies movie reviews as positive or negative using the text of the review.
diff --git a/vignettes/getting_started.Rmd b/vignettes/getting_started.Rmd
index 54667ec1a..3286d92f0 100644
--- a/vignettes/getting_started.Rmd
+++ b/vignettes/getting_started.Rmd
@@ -192,7 +192,7 @@ Keras provides a vocabulary for building deep learning models that is simple, el
 To learn the basics of Keras, we recommend the following sequence of tutorials:
 
-- [Basic Classification](tutorial_basic_classfication.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+- [Basic Classification](tutorial_basic_classification.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
 
 - [Text Classification](tutorial_basic_text_classification.html) --- This tutorial classifies movie reviews as positive or negative using the text of the review.
diff --git a/vignettes/tutorial_basic_regression.Rmd b/vignettes/tutorial_basic_regression.Rmd
index 4355758bf..89807a507 100644
--- a/vignettes/tutorial_basic_regression.Rmd
+++ b/vignettes/tutorial_basic_regression.Rmd
@@ -315,7 +315,7 @@ This notebook introduced a few techniques to handle a regression problem.
 Check out these additional tutorials to learn more:
 
-- [Basic Classification](tutorial_basic_classfication.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+- [Basic Classification](tutorial_basic_classification.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
 
 - [Text Classification](tutorial_basic_text_classification.html) --- This tutorial classifies movie reviews as positive or negative using the text of the review.
diff --git a/vignettes/tutorial_basic_text_classification.Rmd b/vignettes/tutorial_basic_text_classification.Rmd
index 94e8ab74d..aadc87ae5 100644
--- a/vignettes/tutorial_basic_text_classification.Rmd
+++ b/vignettes/tutorial_basic_text_classification.Rmd
@@ -471,7 +471,7 @@ For this particular case, we could prevent overfitting by simply stopping the tr
 Check out these additional tutorials to learn more:
 
-- [Basic Classification](tutorial_basic_classfication.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+- [Basic Classification](tutorial_basic_classification.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
 
 - [Basic Regression](tutorial_basic_regression.html) --- This tutorial builds a model to predict the median price of homes in a Boston suburb during the mid-1970s.
diff --git a/vignettes/tutorial_overfit_underfit.Rmd b/vignettes/tutorial_overfit_underfit.Rmd
index 5d89da537..1ce44fc44 100644
--- a/vignettes/tutorial_overfit_underfit.Rmd
+++ b/vignettes/tutorial_overfit_underfit.Rmd
@@ -410,7 +410,7 @@ And two important approaches not covered in this guide are data augmentation and
 Check out these additional tutorials to learn more:
 
-- [Basic Classification](tutorial_basic_classfication.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+- [Basic Classification](tutorial_basic_classification.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
 
 - [Text Classification](tutorial_basic_text_classification.html) --- This tutorial classifies movie reviews as positive or negative using the text of the review.
diff --git a/vignettes/tutorial_save_and_restore.Rmd b/vignettes/tutorial_save_and_restore.Rmd
index ccedf42b0..5b4f4dee4 100644
--- a/vignettes/tutorial_save_and_restore.Rmd
+++ b/vignettes/tutorial_save_and_restore.Rmd
@@ -309,7 +309,7 @@ In this case, weights were saved on all epochs but the 6th and 7th, where valida
 Check out these additional tutorials to learn more:
 
-- [Basic Classification](tutorial_basic_classfication.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+- [Basic Classification](tutorial_basic_classification.html) --- In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
 
 - [Text Classification](tutorial_basic_text_classification.html) --- This tutorial classifies movie reviews as positive or negative using the text of the review.
diff --git a/website/articles/examples/lstm_seq2seq.html b/website/articles/examples/lstm_seq2seq.html
index 935835a8b..727a151fe 100644
--- a/website/articles/examples/lstm_seq2seq.html
+++ b/website/articles/examples/lstm_seq2seq.html
@@ -194,11 +194,11 @@

 latent_dim = 256    # Latent dimensionality of the encoding space.
 num_samples = 10000 # Number of samples to train on.
 
-## Path to the data txt file on disk.
+## Path to the data txt file on disk.
 data_path = 'fra.txt'
 text <- fread(data_path, sep="\t", header=FALSE, nrows=num_samples)
 
-## Vectorize the data.
+## Vectorize the data.
 input_texts  <- text[[1]]
 target_texts <- paste0('\t',text[[2]],'\n')
 input_texts  <- lapply( input_texts, function(s) strsplit(s, split="")[[1]])
@@ -236,54 +236,54 @@

     d3 <- sapply( target_characters, function(x) {
         as.integer(x == target_texts[[i]][-1]) })
     decoder_target_data[i,1:nrow(d3),] <- d3
 }
 
-## Create the model
-
-## Define an input sequence and process it.
+## Create the model
+
+## Define an input sequence and process it.
 encoder_inputs  <- layer_input(shape=list(NULL,num_encoder_tokens))
 encoder         <- layer_lstm(units=latent_dim, return_state=TRUE)
 encoder_results <- encoder_inputs %>% encoder
-## We discard `encoder_outputs` and only keep the states.
+## We discard `encoder_outputs` and only keep the states.
 encoder_states  <- encoder_results[2:3]
 
-## Set up the decoder, using `encoder_states` as initial state.
+## Set up the decoder, using `encoder_states` as initial state.
 decoder_inputs  <- layer_input(shape=list(NULL, num_decoder_tokens))
-## We set up our decoder to return full output sequences,
-## and to return internal states as well. We don't use the
-## return states in the training model, but we will use them in inference.
+## We set up our decoder to return full output sequences,
+## and to return internal states as well. We don't use the
+## return states in the training model, but we will use them in inference.
 decoder_lstm    <- layer_lstm(units=latent_dim, return_sequences=TRUE,
                               return_state=TRUE, stateful=FALSE)
 decoder_results <- decoder_lstm(decoder_inputs, initial_state=encoder_states)
 decoder_dense   <- layer_dense(units=num_decoder_tokens, activation='softmax')
 decoder_outputs <- decoder_dense(decoder_results[[1]])
 
-## Define the model that will turn
-## `encoder_input_data` & `decoder_input_data` into `decoder_target_data`
+## Define the model that will turn
+## `encoder_input_data` & `decoder_input_data` into `decoder_target_data`
 model <- keras_model( inputs = list(encoder_inputs, decoder_inputs),
                       outputs = decoder_outputs )
 
-## Compile model
+## Compile model
 model %>% compile(optimizer='rmsprop', loss='categorical_crossentropy')
 
-## Run model
+## Run model
 model %>% fit( list(encoder_input_data, decoder_input_data), decoder_target_data,
                batch_size=batch_size,
                epochs=epochs,
                validation_split=0.2)
 
-## Save model
+## Save model
 save_model_hdf5(model,'s2s.h5')
 save_model_weights_hdf5(model,'s2s-wt.h5')
 
-##model <- load_model_hdf5('s2s.h5')
-##load_model_weights_hdf5(model,'s2s-wt.h5')
-
-## Here's the drill:
-## 1) encode input and retrieve initial decoder state
-## 2) run one step of decoder with this initial state
-## and a "start of sequence" token as target.
-## Output will be the next target token
-## 3) Repeat with the current target token and current states
+##model <- load_model_hdf5('s2s.h5')
+##load_model_weights_hdf5(model,'s2s-wt.h5')
+
+## Here's the drill:
+## 1) encode input and retrieve initial decoder state
+## 2) run one step of decoder with this initial state
+## and a "start of sequence" token as target.
+## Output will be the next target token
+## 3) Repeat with the current target token and current states
 
-## Define sampling models
+## Define sampling models
 encoder_model <-  keras_model(encoder_inputs, encoder_states)
 decoder_state_input_h <- layer_input(shape=latent_dim)
 decoder_state_input_c <- layer_input(shape=latent_dim)
@@ -295,51 +295,51 @@
     inputs = c(decoder_inputs, decoder_states_inputs),
     outputs = c(decoder_outputs, decoder_states))
 
-## Reverse-lookup token index to decode sequences back to
-## something readable.
+## Reverse-lookup token index to decode sequences back to
+## something readable.
 reverse_input_char_index  <- as.character(input_characters)
 reverse_target_char_index <- as.character(target_characters)
 
 decode_sequence <- function(input_seq) {
-    ## Encode the input as state vectors.
+    ## Encode the input as state vectors.
     states_value <- predict(encoder_model, input_seq)
 
-    ## Generate empty target sequence of length 1.
+    ## Generate empty target sequence of length 1.
     target_seq <- array(0, dim=c(1, 1, num_decoder_tokens))
 
-    ## Populate the first character of target sequence with the start character.
+    ## Populate the first character of target sequence with the start character.
     target_seq[1, 1, target_token_index['\t']] <- 1.
 
-    ## Sampling loop for a batch of sequences
-    ## (to simplify, here we assume a batch of size 1).
+    ## Sampling loop for a batch of sequences
+    ## (to simplify, here we assume a batch of size 1).
     stop_condition = FALSE
     decoded_sentence = ''
     maxiter = max_decoder_seq_length
     niter = 1
     while (!stop_condition && niter < maxiter) {
 
-        ## output_tokens, h, c = decoder_model.predict([target_seq] + states_value)
+        ## output_tokens, h, c = decoder_model.predict([target_seq] + states_value)
         decoder_predict <- predict(decoder_model, c(list(target_seq), states_value))
         output_tokens <- decoder_predict[[1]]
 
-        ## Sample a token
+        ## Sample a token
         sampled_token_index <- which.max(output_tokens[1, 1, ])
         sampled_char <- reverse_target_char_index[sampled_token_index]
         decoded_sentence <- paste0(decoded_sentence, sampled_char)
         decoded_sentence
 
-        ## Exit condition: either hit max length
-        ## or find stop character.
+        ## Exit condition: either hit max length
+        ## or find stop character.
         if (sampled_char == '\n' ||
             length(decoded_sentence) > max_decoder_seq_length) {
             stop_condition = TRUE
         }
 
-        ## Update the target sequence (of length 1).
-        ## target_seq = np.zeros((1, 1, num_decoder_tokens))
+        ## Update the target sequence (of length 1).
+        ## target_seq = np.zeros((1, 1, num_decoder_tokens))
         target_seq[1, 1, ] <- 0
         target_seq[1, 1, sampled_token_index] <- 1.
 
-        ## Update states
+        ## Update states
         h <- decoder_predict[[2]]
         c <- decoder_predict[[3]]
         states_value = list(h, c)
@@ -349,8 +349,8 @@
 }
 
 for (seq_index in 1:100) {
-    ## Take one sequence (part of the training test)
-    ## for trying out decoding.
+    ## Take one sequence (part of the training test)
+    ## for trying out decoding.
     input_seq = encoder_input_data[seq_index,,,drop=FALSE]
     decoded_sentence = decode_sequence(input_seq)
     target_sentence <- gsub("\t|\n","",paste(target_texts[[seq_index]],collapse=''))
diff --git a/website/articles/examples/variational_autoencoder_deconv.html b/website/articles/examples/variational_autoencoder_deconv.html
index a3fac3380..acb4f4007 100644
--- a/website/articles/examples/variational_autoencoder_deconv.html
+++ b/website/articles/examples/variational_autoencoder_deconv.html
@@ -162,7 +162,7 @@

library(keras)
 K <- keras::backend()
 
-#### Parameterization ####
+#### Parameterization ####
 
 # input image dimensions
 img_rows <- 28L
@@ -185,7 +185,7 @@
 epochs <- 5L
 
-#### Model Construction ####
+#### Model Construction ####
 
 original_img_size <- c(img_rows, img_cols, img_chns)
@@ -304,15 +304,15 @@
     k_mean(xent_loss + kl_loss)
 }
 
-## variational autoencoder
+## variational autoencoder
 vae <- keras_model(x, x_decoded_mean_squash)
 vae %>% compile(optimizer = "rmsprop", loss = vae_loss)
 summary(vae)
 
-## encoder: model to project inputs on the latent space
+## encoder: model to project inputs on the latent space
 encoder <- keras_model(x, z_mean)
 
-## build a digit generator that can sample from the learned distribution
+## build a digit generator that can sample from the learned distribution
 gen_decoder_input <- layer_input(shape = latent_dim)
 gen_hidden_decoded <- decoder_hidden(gen_decoder_input)
 gen_up_decoded <- decoder_upsample(gen_hidden_decoded)
@@ -324,7 +324,7 @@
 generator <- keras_model(gen_decoder_input, gen_x_decoded_mean_squash)
 
-#### Data Preparation ####
+#### Data Preparation ####
 
 mnist <- dataset_mnist()
 data <- lapply(mnist, function(m) {
@@ -334,7 +334,7 @@
 x_test <- data$test
 
-#### Model Fitting ####
+#### Model Fitting ####
 
 vae %>% fit(
   x_train, x_train,
@@ -345,19 +345,19 @@
 )
 
-#### Visualizations ####
+#### Visualizations ####
 
 library(ggplot2)
 library(dplyr)
 
-## display a 2D plot of the digit classes in the latent space
+## display a 2D plot of the digit classes in the latent space
 x_test_encoded <- predict(encoder, x_test, batch_size = batch_size)
 
 x_test_encoded %>%
   as_data_frame() %>%
   mutate(class = as.factor(mnist$test$y)) %>%
   ggplot(aes(x = V1, y = V2, colour = class)) + geom_point()
 
-## display a 2D manifold of the digits
+## display a 2D manifold of the digits
 n <- 15  # figure with 15x15 digits
 digit_size <- 28
diff --git a/website/articles/getting_started.html b/website/articles/getting_started.html
index a9a3f2c17..4e578f19d 100644
--- a/website/articles/getting_started.html
+++ b/website/articles/getting_started.html
@@ -279,7 +279,7 @@

 Tutorials
 
 To learn the basics of Keras, we recommend the following sequence of tutorials:
 
-  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
   • Text Classification — This tutorial classifies movie reviews as positive or negative using the text of the review.
   • Basic Regression — This tutorial builds a model to predict the median price of homes in a Boston suburb during the mid-1970s.
   • Overfitting and Underfitting — In this tutorial, we explore two common regularization techniques (weight regularization and dropout) and use them to improve our movie review classification results.
diff --git a/website/articles/tutorial_basic_regression.html b/website/articles/tutorial_basic_regression.html
index 27fc2fb79..669870bcb 100644
--- a/website/articles/tutorial_basic_regression.html
+++ b/website/articles/tutorial_basic_regression.html
@@ -371,7 +371,7 @@

 More Tutorials
 
 Check out these additional tutorials to learn more:
 
-  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
   • Text Classification — This tutorial classifies movie reviews as positive or negative using the text of the review.
   • Overfitting and Underfitting — In this tutorial, we explore two common regularization techniques (weight regularization and dropout) and use them to improve our movie review classification results.
   • Save and Restore Models — This tutorial demonstrates various ways to save and share models (after as well as during training).
diff --git a/website/articles/tutorial_basic_text_classification.html b/website/articles/tutorial_basic_text_classification.html
index 00410a64f..de1f4448b 100644
--- a/website/articles/tutorial_basic_text_classification.html
+++ b/website/articles/tutorial_basic_text_classification.html
@@ -493,7 +493,7 @@

 More Tutorials
 
 Check out these additional tutorials to learn more:
 
-  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
   • Basic Regression — This tutorial builds a model to predict the median price of homes in a Boston suburb during the mid-1970s.
   • Overfitting and Underfitting — In this tutorial, we explore two common regularization techniques (weight regularization and dropout) and use them to improve our movie review classification results.
   • Save and Restore Models — This tutorial demonstrates various ways to save and share models (after as well as during training).
diff --git a/website/articles/tutorial_overfit_underfit.html b/website/articles/tutorial_overfit_underfit.html
index 4583a26e7..169e223c8 100644
--- a/website/articles/tutorial_overfit_underfit.html
+++ b/website/articles/tutorial_overfit_underfit.html
@@ -457,7 +457,7 @@

 More Tutorials
 
 Check out these additional tutorials to learn more:
 
-  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
   • Text Classification — This tutorial classifies movie reviews as positive or negative using the text of the review.
   • Basic Regression — This tutorial builds a model to predict the median price of homes in a Boston suburb during the mid-1970s.
   • Save and Restore Models — This tutorial demonstrates various ways to save and share models (after as well as during training).
diff --git a/website/articles/tutorial_save_and_restore.html b/website/articles/tutorial_save_and_restore.html
index 06b6ef495..3c94ddc37 100644
--- a/website/articles/tutorial_save_and_restore.html
+++ b/website/articles/tutorial_save_and_restore.html
@@ -367,7 +367,7 @@

 More Tutorials
 
 Check out these additional tutorials to learn more:
 
-  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
   • Text Classification — This tutorial classifies movie reviews as positive or negative using the text of the review.
   • Basic Regression — This tutorial builds a model to predict the median price of homes in a Boston suburb during the mid-1970s.
   • Overfitting and Underfitting — In this tutorial, we explore two common regularization techniques (weight regularization and dropout) and use them to improve our movie review classification results.
diff --git a/website/favicon.ico b/website/favicon.ico
index cdff01c12..be319509f 100644
Binary files a/website/favicon.ico and b/website/favicon.ico differ
diff --git a/website/index.html b/website/index.html
index 377989a8a..e01f73823 100644
--- a/website/index.html
+++ b/website/index.html
@@ -296,7 +296,7 @@

 Tutorials
 
 To learn the basics of Keras, we recommend the following sequence of tutorials:
 
-  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
+  • Basic Classification — In this tutorial, we train a neural network model to classify images of clothing, like sneakers and shirts.
   • Text Classification — This tutorial classifies movie reviews as positive or negative using the text of the review.
   • Basic Regression — This tutorial builds a model to predict the median price of homes in a Boston suburb during the mid-1970s.
   • Overfitting and Underfitting — In this tutorial, we explore two common regularization techniques (weight regularization and dropout) and use them to improve our movie review classification results.
diff --git a/website/news/index.html b/website/news/index.html
index bd8dd6a6d..5e90c1ab6 100644
--- a/website/news/index.html
+++ b/website/news/index.html
@@ -206,6 +206,7 @@

   • Support for defining custom Keras models (i.e. custom call() logic for forward pass)
   • Handle named list of model output names in metrics argument of compile()
   • New custom_metric() function for defining custom metrics in R
+  • Provide typed wrapper for categorical custom metrics
   • Provide access to Python layer within R custom layers
   • Don’t convert custom layer output shape to tuple when shape is a list or tuple of other shapes
   • Re-export shape() function from tensorflow package
diff --git a/website/pkgdown.yml b/website/pkgdown.yml
index 35f362077..3f9c841df 100644
--- a/website/pkgdown.yml
+++ b/website/pkgdown.yml
@@ -1,4 +1,4 @@
-pandoc: 2.2.1
+pandoc: 2.2.3.1
 pkgdown: 1.1.0
 pkgdown_sha: ~
 articles: