author     Brian Broll <brian.broll@vanderbilt.edu>   2017-02-22 20:25:07 +0300
committer  GitHub <noreply@github.com>                2017-02-22 20:25:07 +0300
commit     41178564c364ee9fba8986faf32f695ac5cad22d (patch)
tree       a93b408cdefd6b91715d2a43fc057db99e6ebec1
parent     3e98a01ef457d486c5367cfe95ff93f51cfbd515 (diff)
Fixed typo in docs
-rw-r--r--    README.md    2
1 files changed, 1 insertions, 1 deletions

@@ -91,7 +91,7 @@ If you don't want to convert all modules you can pass a function as the third ar
 It will be called at each step, with a module that is currently converted. It is meant to exclude modules i.e. if it returns `true`, they will be left untouched, otherwise they will be subject to conversion.
-`Note that you cannot do backward pass when using cuDNN and when your model has batch normaliation layers and is in evaluate mode.`
+`Note that you cannot do backward pass when using cuDNN and when your model has batch normalization layers and is in evaluate mode.`
 ```lua
 net = nn.Sequential()
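The README text shown in the diff describes passing an exclusion function as the third argument to `cudnn.convert`. A minimal sketch of that usage, assuming a CUDA-capable Torch setup (the network layers here are illustrative, not from the commit):

```lua
require 'nn'
require 'cudnn'

-- Illustrative network: a convolution followed by a ReLU.
local net = nn.Sequential()
net:add(nn.SpatialConvolution(3, 16, 3, 3))
net:add(nn.ReLU())

-- The third argument is called once per module during conversion.
-- Returning `true` leaves that module untouched; otherwise the
-- module is converted to its cudnn equivalent.
cudnn.convert(net, cudnn, function(module)
   return torch.type(module):find('ReLU') ~= nil
end)
```

After this call the convolution is a `cudnn.SpatialConvolution` while the `nn.ReLU` is left as-is, matching the "left untouched" behavior the README paragraph describes.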
@@ -91,7 +91,7 @@ If you don't want to convert all modules you can pass a function as the third ar It will be called at each step, with a module that is currently converted. It is meant to exclude modules i.e. if it returns `true`, they will be left untouched, otherwise they will be subject to conversion. -`Note that you cannot do backward pass when using cuDNN and when your model has batch normaliation layers and is in evaluate mode.` +`Note that you cannot do backward pass when using cuDNN and when your model has batch normalization layers and is in evaluate mode.` ```lua net = nn.Sequential() |