1 parent 8f5c1e3 commit d9a63f1
beginner_source/nlp/deep_learning_tutorial.py
@@ -172,7 +172,7 @@
 # attempting to do something more than just this vanilla gradient update.
 # Many attempt to vary the learning rate based on what is happening at
 # train time. You don't need to worry about what specifically these
-# algorithms are doing unless you are really interested. Torch provies
+# algorithms are doing unless you are really interested. Torch provides
 # many in the torch.optim package, and they are all completely
 # transparent. Using the simplest gradient update is the same as the more
 # complicated algorithms. Trying different update algorithms and different
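For context, the passage being corrected is about swapping optimizers in `torch.optim`. A minimal sketch of what it describes, with an arbitrary toy model and hyperparameters chosen purely for illustration: the update algorithm is changed by constructing a different optimizer, while the training loop stays the same.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Toy model and loss, only for illustration.
model = nn.Linear(4, 2)
loss_fn = nn.MSELoss()

# Swapping the update algorithm only changes this one line;
# SGD, Adam, RMSprop, etc. all expose the same interface.
optimizer = optim.SGD(model.parameters(), lr=0.01)
# optimizer = optim.Adam(model.parameters(), lr=0.01)

inputs = torch.randn(8, 4)
targets = torch.randn(8, 2)

optimizer.zero_grad()                    # clear accumulated gradients
loss = loss_fn(model(inputs), targets)   # forward pass and loss
loss.backward()                          # compute gradients
optimizer.step()                         # apply the chosen update rule
```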