Mirror of github.com/torch/optim.git, hosted at ThFree Co, Russian Federation.
Branches: assertions, conffix, master, nanfix, require, revert-113-sgd-lrs-fix, revert-63-confusion_matrix_fix, split
Commit log (most recent first):

Age         Commit message                                                                 Author
2015-03-06  Update adam.lua                                                                Ajay Talati
2015-03-06  Fixed sign in update rule & default parameters                                 Ajay Talati
2015-03-06  Merge pull request #51 from Atcold/patch-1                                     Soumith Chintala
2015-03-01  removed clone and now only initialise grad squared on first call to function   Will Williams
2015-03-01  added clapming to rmsprop lr multiplier                                        Will Williams
2015-02-27  Update README.md                                                               Alfredo Canziani
2015-02-27  First commit                                                                   Ajay Talati
2015-02-27  implementation of rmsprop that uses eplison values to help stabilise optimisa...  Will Williams
2015-02-26  Merge pull request #48 from fidlej/topic_optimized_adam                        koray kavukcuoglu
2015-02-26  Fixed optim.adam.                                                              Ivo Danihelka
2015-02-21  Merge pull request #46 from louissmit/master                                   Soumith Chintala
2015-02-19  lambda default fix                                                             louissmit
2015-02-15  Merge pull request #43 from juscodit/master                                    koray kavukcuoglu
2015-02-14  fix 2 bugs in roots() of polyinterp.lua:                                       juscodit
2015-02-03  Merge pull request #39 from louissmit/master                                   Soumith Chintala
2015-02-03  optim style comments                                                           louissmit
2015-02-03  added adam to init.lua + adam comments                                         louissmit
2015-02-02  Merge pull request #38 from louissmit/master                                   Clement Farabet
2015-02-02  adam                                                                           louissmit
2015-01-29  Merge pull request #37 from lvdmaaten/master                                   Clement Farabet
2015-01-29  add function to numerically check gradient                                     lvdmaaten
2015-01-07  Merge pull request #36 from sagarwaghmare69/confusion_FRR_FAR                  Soumith Chintala
2015-01-07  removing buggy and non-working code that was merged in haste                   Soumith Chintala
2015-01-05  Added FAR/FRR computation to ConfusionMatrix.lua.                              Sagar M Waghmare
2014-12-14  Merge pull request #32 from Aysegul/weightdecays                               Soumith Chintala
2014-12-11  Merge pull request #35 from diz-vara/master                                    Soumith Chintala
2014-12-11  rockspec changes: '&&'                                                         diz_vara
2014-12-10  Merge pull request #34 from diz-vara/master                                    Soumith Chintala
2014-12-10  ';' removed                                                                    diz_vara
2014-11-30  copy variables into state params for type casting                              Aysegul Dundar
2014-11-28  user either sets wd or wds                                                     Aysegul Dundar
2014-11-24  more efficient way of state parameters update                                  Aysegul Dundar
2014-11-24  weight decay for individual param option added                                 Aysegul Dundar
2014-10-28  Merge pull request #31 from jjh42/require                                      Soumith Chintala
2014-10-24  Change dofiles to require                                                      Jonathan Hunt
2014-10-23  Merge pull request #30 from nicholas-leonard/confusion                         Soumith Chintala
2014-10-23  added test_confusion unit tests                                                nicholas-leonard
2014-10-22  Confusion:batchAdd supports cuda tensors                                       nicholas-leonard
2014-10-14  Added nag to init.                                                             Clement Farabet
2014-10-13  add Nesterov Accelerated Gradient from @dilipkay                               koray kavukcuoglu
2014-10-10  Merge pull request #26 from nicholas-leonard/confusion                         Clement Farabet
2014-10-10  Added support for missing targets                                              Nicholas Leonard
2014-08-21  Merge pull request #23 from nicholas-leonard/master                            koray kavukcuoglu
2014-08-21  Merge pull request #21 from jonathantompson/cuda_lbfgs                         koray kavukcuoglu
2014-08-21  fix Confusion bug                                                              nicholas-leonard
2014-08-20  Merge pull request #22 from skaae/master                                       Clement Farabet
2014-08-20  add mcc, sens, spec, to confMatrix                                             Søren Sønderby
2014-08-18  updated lbfgs to support cuda tensors.                                         Jonathan Tompson
2014-07-08  changed learning rate for adagrad test.                                        Clement Farabet
2014-07-08  Merge pull request #20 from soumith/master                                     Clement Farabet
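Several entries above concern optim.adam ("adam", "Fixed sign in update rule & default parameters", "Fixed optim.adam."). For context, here is a minimal sketch of the Adam update rule with the defaults commonly cited for it (lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8) — an illustration of the algorithm only, not the torch/optim Lua implementation; the function name `adam_step` and the scalar-parameter setup are this sketch's own.

```python
# Minimal Adam update sketch (standard defaults); illustrative only,
# not the torch/optim implementation.

def adam_step(param, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply one Adam update to a scalar parameter; state holds m, v, t."""
    state["t"] += 1
    # Exponential moving averages of the gradient and its square.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad * grad
    # Bias-corrected estimates (matter most in early steps).
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    # Descent direction: the minus sign here is the kind of detail the
    # "Fixed sign in update rule" commit refers to; eps stabilises the divide.
    return param - lr * m_hat / (v_hat ** 0.5 + eps)

# Usage: minimise f(x) = x^2 from x = 1.0 (gradient is 2x).
state = {"m": 0.0, "v": 0.0, "t": 0}
x = 1.0
for _ in range(2000):
    x = adam_step(x, 2.0 * x, state)
```

With the sign correct, x drifts steadily toward the minimum at 0; with the sign flipped, the iterate ascends instead, which is the failure mode such a fix addresses.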