
github.com/torch/optim.git
author    Andreas Fidjeland <andreas@fidjeland.io>  2013-12-05 23:47:15 +0400
committer Andreas Fidjeland <andreas@fidjeland.io>  2013-12-05 23:47:15 +0400
commit  40372a6e043f2bce905be1966f48983a2924fc2a (patch)
tree    3a00c611763e8c4d17c2a8574cbba9d15180cfac /adagrad.lua
parent  6bf15e408a423b13abe81f4566bb457331581134 (diff)
Better formatting for docstrings in REPL
* fixed line-wrap issues
* fixed nested list issues
Diffstat (limited to 'adagrad.lua')
-rw-r--r--  adagrad.lua | 14
1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/adagrad.lua b/adagrad.lua
index a65279f..b23715c 100644
--- a/adagrad.lua
+++ b/adagrad.lua
@@ -1,17 +1,17 @@
--[[ ADAGRAD implementation for SGD
ARGS:
-- opfunc : a function that takes a single input (X), the point of
+- `opfunc` : a function that takes a single input (X), the point of
evaluation, and returns f(X) and df/dX
-- x : the initial point
-- state : a table describing the state of the optimizer; after each
+- `x` : the initial point
+- `state` : a table describing the state of the optimizer; after each
call the state is modified
- state.learningRate : learning rate
- state.paramVariance : vector of temporal variances of parameters
+- `state.learningRate` : learning rate
+- `state.paramVariance` : vector of temporal variances of parameters
RETURN:
-- x : the new x vector
-- f(x) : the function, evaluated before the update
+- `x` : the new x vector
+- `f(x)` : the function, evaluated before the update
]]
function optim.adagrad(opfunc, x, config, state)
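The docstring in the patch above describes Adagrad's contract: `opfunc` returns `f(X)` and `df/dX`, `state.paramVariance` accumulates the squared gradients across calls, and the returned `f(x)` is evaluated before the update. A minimal plain-Python sketch of that update rule follows; it is illustrative only (the real `optim.adagrad` operates on Torch tensors, and the `config`/`state` split shown in the signature is not reproduced here — the stabilizing `1e-10` term is an assumption, not taken from this patch):

```python
import math

def adagrad_step(opfunc, x, state):
    # One Adagrad step on a plain list-of-floats parameter vector.
    # state["paramVariance"][i] accumulates the squared gradient of x[i],
    # and each coordinate's step is scaled by 1/sqrt of that accumulator.
    lr = state.get("learningRate", 1e-3)
    fx, dfdx = opfunc(x)  # f(x) evaluated *before* the update
    if "paramVariance" not in state:
        state["paramVariance"] = [0.0] * len(x)
    for i, g in enumerate(dfdx):
        state["paramVariance"][i] += g * g
        # small constant avoids division by zero on the first step
        x[i] -= lr * g / (math.sqrt(state["paramVariance"][i]) + 1e-10)
    return x, fx

# usage: minimize f(x) = x^2, whose gradient is 2x
state = {"learningRate": 0.5}
x = [2.0]
for _ in range(100):
    x, fx = adagrad_step(lambda p: (p[0] ** 2, [2 * p[0]]), x, state)
```

Because the accumulated variance only grows, the effective per-coordinate learning rate shrinks over time, which is why Adagrad needs no manual decay schedule for this kind of convex problem.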