
github.com/torch/optim.git
author    Menshykov <ihor.ibm@gmail.com>  2016-10-16 23:01:19 +0300
committer GitHub <noreply@github.com>     2016-10-16 23:01:19 +0300
commit    14b45ec3dfd75baa6981e7b105c965fb187487fd (patch)
tree      866f7952e8d822a881f97386ba61d7d3f9694b92
parent    9f6367cff15592db3321a2913f47dacb2abc3c3e (diff)
Update algos.md
doc/algos.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/doc/algos.md b/doc/algos.md
index a3ce681..a8dba9f 100644
--- a/doc/algos.md
+++ b/doc/algos.md
@@ -269,7 +269,7 @@ Algorithm is published in http://epubs.siam.org/doi/abs/10.1137/080716542
<a name='optim.nag'></a>
## nag(opfunc, x[, config][, state])
-An implementation of *SGD* adapted with features of *Nesterov's Accelerated Gradient method*, based on the paper "On the Importance of Initialization and Momentum in Deep Learning" (Sutsveker et. al., ICML 2013) http://www.cs.toronto.edu/~fritz/absps/momentum.pdf.
+An implementation of *SGD* adapted with features of *Nesterov's Accelerated Gradient method*, based on the paper "On the Importance of Initialization and Momentum in Deep Learning" (Sutskever et. al., ICML 2013) http://www.cs.toronto.edu/~fritz/absps/momentum.pdf.
Arguments:
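The `nag` routine described in the patched documentation is SGD with a Nesterov look-ahead: the gradient is evaluated at the point reached by applying the current momentum, not at the current iterate. A minimal sketch of that idea (in Python rather than the library's Lua; `nag_step`, the scalar setup, and the default `lr`/`momentum` values here are illustrative assumptions, not torch/optim's actual implementation):

```python
def nag_step(opfunc, x, state, lr=0.1, momentum=0.9):
    """One Nesterov-style update: evaluate the gradient at the
    look-ahead point x + momentum*v, then step with the new velocity.
    Illustrative sketch only, not torch/optim's nag."""
    v = state.get("v", 0.0)
    _, grad = opfunc(x + momentum * v)   # opfunc returns (loss, gradient)
    v = momentum * v - lr * grad
    state["v"] = v                       # persist velocity between calls
    return x + v

# Usage: minimize f(x) = x^2 (gradient 2x), starting from x = 5.0.
opfunc = lambda x: (x * x, 2.0 * x)
x, state = 5.0, {}
for _ in range(100):
    x = nag_step(opfunc, x, state)
```

The `state` dictionary mirrors the `state` argument in the `nag(opfunc, x[, config][, state])` signature above: it carries the velocity across calls so repeated invocations continue the same momentum trajectory.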