
github.com/torch/nn.git - commit log for Module.lua
Date        Commit message  (Author)
2016-09-20  Don't serialize shared parameters in clone()  (Jonas Gehring)
There's no need to serialize parameters that will be shared immediately afterwards. In particular, the current implementation of clone and share is inefficient for single modules in a larger network with flattened parameters, since the storage holding all parameters will be serialized. Here, the write method of nn.Module is temporarily overwritten with a function that writes empty tensors for all shared parameters.
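For context, nn.Module.clone() deep-copies a module by round-tripping it through serialization and then, optionally, re-sharing the listed parameters with the original. A minimal sketch of that pattern, assuming the standard torch.MemoryFile API (the helper name is illustrative):

    local function cloneModule(module, ...)
       -- deep-copy by serializing to an in-memory file and reading it back
       local f = torch.MemoryFile('rw'):binary()
       f:writeObject(module)
       f:seek(1)
       local clone = f:readObject()
       f:close()
       -- re-establish sharing for the requested fields, e.g. 'weight', 'bias'
       if select('#', ...) > 0 then
          clone:share(module, ...)
       end
       return clone
    end

The optimization described above skips writing the soon-to-be-shared tensors during writeObject, since share() overwrites them immediately afterwards anyway.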
2016-04-24  module-replace  (Sergey Zagoruyko)
2016-04-07  add argument passing to module typing shortcuts (#754)  (Clément Masson)
* add argument passing to module typing shortcuts (:cuda(...), :float(...), :double(...))
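In nn, the shortcuts :float(), :double() and :cuda() delegate to Module:type(); this change forwards their arguments along. A usage sketch, assuming the forwarded argument is the tensor cache that :type() accepts for preserving sharing across converted modules:

    -- convert two modules that share storage; passing the same cache table
    -- through the shortcut keeps the sharing intact after conversion
    local cache = {}
    mlp1:float(cache)
    mlp2:float(cache)   -- equivalent to mlp2:type('torch.FloatTensor', cache)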
2016-04-05  Update Module.lua  (rsfbarreira)
2016-03-12  Merge pull request #689 from colesbury/getParameters  (Soumith Chintala)
Assert that weights and gradWeights line up in getParameters
2016-03-04  Add _type field to nn.Module  (Sam Gross)
Currently, there is no clear way to get the 'type' of an nn.Module, although you can set the type via Module:type(<newtype>). This changes Module:type() to return the current type when called with no arguments. The module's type is stored in the _type field.
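A short usage sketch of the getter/setter behavior this commit introduces:

    local m = nn.Linear(10, 10)
    print(m:type())                -- e.g. 'torch.DoubleTensor' (read from m._type)
    m:type('torch.FloatTensor')    -- with an argument, converts the module in place
    print(m:type())                -- 'torch.FloatTensor'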
2016-03-04  Assert that weights and gradWeights line up in getParameters  (Sam Gross)
Adds a check that the parameters have the same offset as their gradients after getParameters is called. If they do not line up, then methods such as torch/optim will not work. This could happen if the sharing of weights and gradWeights does not match or, due to a bug in the implementation of getParameters, if the storages of weights and gradWeights do not closely correspond. Fixes the getParameters tests to always share gradWeights when sharing weights.
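The invariant being asserted can be sketched as follows: after flattening, each parameter view must sit at the same storage offset as its gradient view, so that a single in-place update over the two flat tensors touches every weight correctly (the model variable is assumed; names follow the usual nn conventions):

    local params, gradParams = model:getParameters()  -- two flat tensors
    local ps, gs = model:parameters()                 -- per-module views
    for i = 1, #ps do
       assert(ps[i]:storageOffset() == gs[i]:storageOffset(),
              'weight and gradWeight views must line up')
    end
    params:add(-0.01, gradParams)  -- optim-style update relies on this alignment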
2016-02-09  nn.clearState  (Sergey Zagoruyko)
2016-01-21  read and write methods for nn.Module  (Dominik Grewe)
Allows modules that want to implement custom `read` and `write` methods to call `parent.read/write`.
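A sketch of the pattern this enables, assuming the standard torch.class idiom (nn.MyModule and its extra field are hypothetical):

    local MyModule, parent = torch.class('nn.MyModule', 'nn.Module')

    function MyModule:write(file)
       parent.write(self, file)       -- serialize the standard Module state
       file:writeObject(self.extra)   -- then any custom, non-standard state
    end

    function MyModule:read(file)
       parent.read(self, file)
       self.extra = file:readObject()
    end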
2016-01-12  Revert "Don't re-flatten parameters if they are already flattened"  (Soumith Chintala)
2016-01-12  Don't re-flatten parameters if they are already flattened  (Sam Gross)
2015-09-17  Revert "fixing cuda getparams"  (Soumith Chintala)
2015-09-17  fixing cuda getparams  (soumith)
2015-09-04  nn.Module preserve type sharing semantics (#187); add nn.Module.apply  (Adam Lerer)
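Module:apply(fn) runs a callback on a module and, recursively, on every child. A short usage sketch:

    -- zero every bias in the network, regardless of its structure
    model:apply(function(m)
       if m.bias then m.bias:zero() end
    end)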
2015-09-04  getParameters: improve memory efficiency, fix bug with non-compact tensors  (Adam Lerer)
2015-08-03  Replace Module.flatten by nn.Module.flatten  (Georg Ostrovski)
2015-07-10  Add unit tests for hessian.lua, fix bugs detected by the tests  (Andrey Golovizin)
* Fix initialization of diagHessianBias for nn.SpatialConvolution.
* Fix computing diagHessianBias for nn.SpatialFullConvolution.
* Call module:forward() with the proper input before calling accGradParameters(). Without that, accDiagHessianParameters() produces incorrect results for some convolution classes.
* Move duplicate code from Module.getParameters() to Module.flatten(), which is now used by both the original Module.getParameters() in Module.lua and the replacement Module.getParameters() in hessian.lua (see the sketch below).
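For reference, Module.flatten is a static helper that packs a list of parameter tensors into one contiguous storage; a hedged usage sketch:

    local ps, gs = model:parameters()
    local flatParams = nn.Module.flatten(ps)   -- one tensor viewing all weights
    local flatGrads  = nn.Module.flatten(gs)   -- one tensor viewing all gradients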
2015-05-13  Merge pull request #256 from colesbury/lua52  (Soumith Chintala)
Rename unpack to table.unpack for Lua 5.2
2015-05-05  Check for `nn.Module` and `nn.Criterion` in recursiveType  (Dominik Grewe)
2015-05-05  Rename unpack to table.unpack for Lua 5.2  (Sam Gross)
Torch7 defines table.unpack to unpack if it is not defined.
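That is the usual one-line compatibility shim: Lua 5.1 provides the global unpack, while Lua 5.2 moved it to table.unpack, so defining one in terms of the other lets the same code run on both:

    table.unpack = table.unpack or unpack  -- no-op on 5.2, aliases unpack on 5.1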
2015-04-28  Make type() truly recursive  (Dominik Grewe)
Recursively iterate over the whole table, converting each tensor to the given type. Removes the need for many specialized type() functions.
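The idea can be sketched in a few lines (a simplified version of what nn does; the helper name is illustrative):

    local function recursiveType(param, typeStr)
       if torch.type(param) == 'table' then
          for k, v in pairs(param) do
             param[k] = recursiveType(v, typeStr)  -- descend into nested tables
          end
       elseif torch.isTensor(param) then
          param = param:type(typeStr)              -- convert leaf tensors
       end
       return param
    end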
2015-04-24  Nicer error message when flattening parameters with inconsistent types  (Tim Harley)
2015-03-18  Added 2 calls to collect garbage for getParameters  (Xiang Zhang)
2015-02-06  Module:listModules()  (Nicholas Leonard)
2014-11-21  Fix various unused variables in nn  (Andrew Tulloch)
2014-10-27  Corrected getParameters for partial views  (James Kirkpatrick)
Module:getParameters was incorrectly overwriting parameters that were partial views on larger storages.
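A partial view here means a parameter tensor that occupies only part of a bigger storage. A hedged illustration of such a setup (the sizes and module are arbitrary):

    local big = torch.randn(100)
    local lin = nn.Linear(5, 5)
    -- make the weight a 5x5 view into the larger storage, starting at offset 1
    lin.weight:set(big:storage(), 1, torch.LongStorage{5, 5})
    -- after the fix, flattening must preserve these values rather than clobber them
    local params = lin:getParameters()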
2014-08-13  Check that a module is typed to non-nil  (James Kirkpatrick)
2014-08-05  getParameters can return an empty tensor  (Sergio Gomez)
If parameters() is not defined or returns an empty table, the method getParameters() will return an empty tensor.
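A quick usage sketch:

    local m = nn.Identity()          -- defines no weight or bias
    local p = m:getParameters()
    print(p:nElement())              -- 0: an empty tensor rather than nil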
2014-07-25  Changed module name as per Nicholas's suggestion  (Jonathan Tompson)
2014-07-25  Added 'Module:findModulesByTypename' function  (Jonathan Tompson)
Added a comment about top level container to the doc.
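A usage sketch; in later versions of nn this API appears as Module:findModules, which is presumably what the rename in the entry above produced:

    -- collect every convolution module in an arbitrarily nested model
    local convs = model:findModules('nn.SpatialConvolution')
    for i = 1, #convs do
       print(i, convs[i].weight:size())
    end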
2014-07-23  Fixed the recursive type call  (Jonathan Tompson)
2014-07-22  Made module type conversion for member variables recursive (so that tables of tensors are also converted)  (Jonathan Tompson)
2014-07-06  added Module:evaluate/training  (nicholas-leonard)
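These toggle a module's train flag recursively, which modules like nn.Dropout consult at forward time; a usage sketch:

    model:training()   -- sets train = true: Dropout samples a fresh mask
    model:evaluate()   -- sets train = false: Dropout acts deterministically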
2013-06-11  Using a default scale in Module.backward()  (Clement Farabet)
This should not affect anything, as modules are always used within containers. Mostly important during testing.
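A sketch of what the default looks like in Module.backward (simplified; containers normally pass scale explicitly):

    function Module:backward(input, gradOutput, scale)
       scale = scale or 1   -- the default this commit introduces
       self:updateGradInput(input, gradOutput)
       self:accGradParameters(input, gradOutput, scale)
       return self.gradInput
    end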
2013-03-23  Sped up getParameters() in simple situations  (Clement Farabet)
2012-11-29  Improved Module:getParameters() speed when using many storages  (Ivo Danihelka)
2012-10-21  Fixed getParameters() for CUDA. When did that break?  (Clement Farabet)
2012-09-24  Oops, remove useless print  (Clement Farabet)
2012-09-21  Added an extra corner case to getParameters()  (Clement Farabet)
This might finally fix all the possible corners. Fix by Michael Matthieu.
2012-07-14  Attempt to fix getParameters  (Clement Farabet)
2012-04-01  arg -> {...}, for LuaJIT  (Clement Farabet)
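Background: Lua 5.0 gave vararg functions an implicit arg table, which LuaJIT does not provide; the portable form collects the varargs explicitly. A minimal sketch:

    -- old Lua 5.0 style (breaks under LuaJIT):
    --   function f(...) return arg.n, arg[1] end

    -- portable style:
    local function f(...)
       local args = {...}
       return select('#', ...), args[1]
    end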
2012-03-03  Implementing a call operator for nn modules and criterions  (Clement Farabet)
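The call operator makes a module instance callable, forwarding to :forward(); a usage sketch:

    local m = nn.Tanh()
    local x = torch.randn(3)
    local y1 = m:forward(x)
    local y2 = m(x)           -- equivalent, via the call metamethod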
2012-02-19  Add reset function to module.lua and Sequential.lua. CmdLine accepts ignore=false for string function  (Koray Kavukcuoglu)
2012-01-25  initial revamp of torch7 tree  (Ronan Collobert)