
github.com/torch/optim.git
Age         Commit message                                   Author
2017-11-28  Fixed the link to the Adam research paper        ProGamerGov
            Fixed the link to the "Adam: A Method for Stochastic Optimization"
            research paper. The old link no longer works:
            http://arxiv.org/pdf/1412.6980.pdf. I and many others involved with
            machine learning find it better to link to the paper's arXiv page itself
            rather than directly to the PDF file, because it is not easy to get from
            the PDF to the paper's arXiv page, but it is easy to get from the arXiv
            page to the PDF.
2016-07-21  Add learningRateDecay to Adam                    Cadene
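            The commit title only names the new option. As a rough illustration, a
            learningRateDecay of this kind is typically applied by annealing the base
            rate with the step counter; the helper name and the exact annealing
            formula below are assumptions, not the commit's actual diff.

            -- Minimal sketch: anneal the base learning rate with the number of
            -- steps taken so far. Field names follow optim conventions; this is
            -- an illustration only, not the code added by the commit.
            local function annealedLearningRate(config, state)
               local lr  = config.learningRate or 0.001
               local lrd = config.learningRateDecay or 0
               local t   = state.t or 0           -- updates performed so far
               return lr / (1 + t * lrd)          -- decayed rate used for this step
            end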
2016-06-30  Fix bad alignment, trailing spaces and tabs      Alfredo Canziani
2016-06-10  add weight decay support to adam                 gcheron
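            Only the option is named here as well. In optim-style optimizers, L2
            weight decay is usually folded into the gradient before the moment
            estimates are updated; a minimal sketch under that assumption (not the
            commit's actual diff):

            -- Sketch: fold L2 weight decay into the gradient before the Adam
            -- moment updates. dfdx and x are torch Tensors, wd is the decay
            -- coefficient. Illustration only, not the committed code.
            local function applyWeightDecay(dfdx, x, wd)
               if wd and wd ~= 0 then
                  dfdx:add(wd, x)   -- dfdx <- dfdx + wd * x
               end
               return dfdx
            end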
2015-07-30  remove redundant lambda param for adam           Kashif Rasul
            v8 of the paper does not use this parameter.
2015-03-06  Update adam.lua                                  Ajay Talati
2015-03-06  Update adam.lua                                  Ajay Talati
2015-03-06  Update adam.lua                                  Ajay Talati
2015-03-06  Update adam.lua                                  Ajay Talati
2015-03-06  Fixed sign in update rule & default parameters   Ajay Talati
            The sign in the update rule in version 4 (3rd March) of the paper
            doesn't seem to work; the code works when the update to the parameters
            is positive, not negative.
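            For reference, the convention that the file settled on subtracts the
            bias-corrected step from the parameters. A minimal sketch of one Adam
            step under that convention, using optim-style state fields (m, v, t);
            the helper name and default values are assumptions, and this is not the
            commit's actual diff.

            -- Sketch of one Adam step with the "minimize" sign convention:
            -- the bias-corrected step is subtracted from the parameters.
            -- x and dfdx are torch Tensors; state carries m, v and t.
            local function adamStep(x, dfdx, state, lr, beta1, beta2, eps)
               lr, beta1 = lr or 1e-3, beta1 or 0.9
               beta2, eps = beta2 or 0.999, eps or 1e-8
               state.t = (state.t or 0) + 1
               state.m = state.m or x.new(dfdx:size()):zero()
               state.v = state.v or x.new(dfdx:size()):zero()
               state.m:mul(beta1):add(1 - beta1, dfdx)            -- first moment
               state.v:mul(beta2):addcmul(1 - beta2, dfdx, dfdx)  -- second moment
               local stepSize = lr * math.sqrt(1 - beta2^state.t) / (1 - beta1^state.t)
               local denom = state.v:clone():sqrt():add(eps)      -- sqrt(v) + eps
               x:addcdiv(-stepSize, state.m, denom)               -- x <- x - stepSize * m/denom
               return x
            end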
2015-02-26  Fixed optim.adam.                                Ivo Danihelka
            sqrt(v) is now used instead of pow(v, 2), and I used the default values
            from the Adam paper: learningRate=2e-4, epsilon=1e-8, lambda=1-1e-8.
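            The core of this fix is the denominator of the update: Adam divides by
            the square root of the second-moment estimate plus epsilon, not by its
            square. A minimal sketch of the corrected expression (the helper name is
            an assumption, not the actual diff):

            -- Sketch of the corrected Adam denominator: sqrt(v) + epsilon.
            -- v is the running second-moment estimate (a torch Tensor).
            local function adamDenominator(v, epsilon)
               -- the pre-fix form amounted to pow(v, 2) + epsilon, which is wrong
               return v:clone():sqrt():add(epsilon)
            end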
2015-02-19  lambda default fix                               louissmit
2015-02-03  optim style comments                             louissmit
2015-02-03  added adam to init.lua + adam comments           louissmit
2015-02-02  adam                                             louissmit