Fix quickstart code (keras-team#987)
* Fix quickstart code

* Fix quickstart to actually run
LukeWood authored Jul 18, 2022
1 parent 80ca0fb commit 4e5ce48
Showing 1 changed file with 28 additions and 9 deletions.
37 changes: 28 additions & 9 deletions templates/keras_cv/index.md
@@ -31,21 +31,36 @@ Create a preprocessing pipeline:

```diff
 import keras_cv
 import tensorflow as tf
 from tensorflow import keras
+import tensorflow_datasets as tfds

-preprocessing_model = keras.Sequential([
-    keras_cv.layers.RandAugment(value_range=(0, 255))
+preprocessing_layers = [
+    keras_cv.layers.RandomResizedCrop(
+        target_size=(224, 224),
+        crop_area_factor=(0.8, 1.0),
+        aspect_ratio_factor=(3/4, 4/3)
+    ),
+    keras_cv.layers.RandomFlip(),
+    keras_cv.layers.RandAugment(value_range=(0, 255)),
     keras_cv.layers.CutMix(),
     keras_cv.layers.MixUp()
-], name="preprocessing_model")
+]
+
+def augment_data(images, labels):
+    labels = tf.one_hot(labels, 3)
+    inputs = {"images": images, "labels": labels}
+    for layer in preprocessing_layers:
+        inputs = layer(inputs)
+    return inputs['images'], inputs['labels']
```
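The rewritten pipeline works because CutMix and MixUp mix labels together with pixels, so every augmentation layer consumes and returns a single `{"images", "labels"}` dict that is threaded through the loop in `augment_data`. A TF-free sketch of that dict-threading pattern (the `random_flip` and `mix_up` functions below are hypothetical stand-ins, not real `keras_cv` layers, and use a fixed mixing weight where real MixUp samples one):

```python
def random_flip(inputs):
    # An image-only augmentation: touches "images", leaves "labels" alone.
    return dict(inputs, images=[img[::-1] for img in inputs["images"]])

def mix_up(inputs):
    # A label-mixing augmentation must update images AND labels together,
    # which is why the whole dict is passed through every layer.
    images, labels = inputs["images"], inputs["labels"]
    lam = 0.5  # fixed mixing weight for this sketch; real MixUp samples it
    mixed_images = [
        [lam * a + (1 - lam) * b for a, b in zip(x, y)]
        for x, y in zip(images, images[::-1])
    ]
    mixed_labels = [
        [lam * a + (1 - lam) * b for a, b in zip(x, y)]
        for x, y in zip(labels, labels[::-1])
    ]
    return {"images": mixed_images, "labels": mixed_labels}

preprocessing_layers = [random_flip, mix_up]

def augment_data(images, labels):
    inputs = {"images": images, "labels": labels}
    for layer in preprocessing_layers:
        inputs = layer(inputs)
    return inputs["images"], inputs["labels"]

images = [[1.0, 2.0], [3.0, 4.0]]  # two tiny "images" of two pixels each
labels = [[1.0, 0.0], [0.0, 1.0]]  # one-hot labels, two classes
aug_images, aug_labels = augment_data(images, labels)
```

Threading the dict is what lets image-only layers and label-mixing layers share one uniform interface.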

Augment a `tf.data.Dataset`:

```diff
-dataset = dataset.map(lambda images, labels: {"images": images, "labels": labels})
-dataset = dataset.map(preprocessing_model)
-dataset = dataset.map(lambda inputs: (inputs["images"], inputs["labels"]))
+dataset = tfds.load('rock_paper_scissors', as_supervised=True, split='train')
+dataset = dataset.batch(64)
+dataset = dataset.map(augment_data)
```
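Two details of this pipeline are easy to miss: `tf.one_hot(labels, 3)` in `augment_data` uses depth 3 because `rock_paper_scissors` has exactly three classes, and `batch` runs before `map` because CutMix and MixUp combine examples within a batch. A pure-Python stand-in for the one-hot step (the `one_hot` helper below is illustrative, not TensorFlow's implementation):

```python
def one_hot(labels, depth):
    # Illustrative stand-in for tf.one_hot: integer class ids -> one-hot rows.
    return [[1.0 if i == label else 0.0 for i in range(depth)]
            for label in labels]

# rock=0, paper=1, scissors=2
encoded = one_hot([0, 2, 1], 3)
# -> [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0], [0.0, 1.0, 0.0]]
```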

Create a model:
@@ -54,9 +69,13 @@ Create a model:
```diff
 densenet = keras_cv.models.DenseNet121(
     include_rescaling=True,
     include_top=True,
-    num_classes=102
+    num_classes=3
 )
-densenet.compile(optimizer='adam', metrics=['accuracy'])
+densenet.compile(
+    loss='categorical_crossentropy',
+    optimizer='adam',
+    metrics=['accuracy']
+)
```
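The explicit `loss='categorical_crossentropy'` matters here: MixUp and CutMix produce soft, non-integer label vectors, which the dense loss handles but `sparse_categorical_crossentropy` (integer class ids) cannot represent. A small sketch of the computation (hypothetical helper, assuming probability vectors as plain lists):

```python
import math

def categorical_crossentropy(y_true, y_pred, eps=1e-7):
    # Cross-entropy against a full probability vector; works with the soft
    # labels that MixUp/CutMix produce, unlike the sparse (integer-id) loss.
    return -sum(t * math.log(max(p, eps)) for t, p in zip(y_true, y_pred))

# A MixUp-style soft label still yields a well-defined loss value:
loss = categorical_crossentropy([0.5, 0.5, 0.0], [0.4, 0.5, 0.1])
```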

Train your model:
