| Age | Commit message | Author |
|---|---|---|
| 2016-09-20 | Don't serialize shared parameters in clone() | Jonas Gehring |
| | There's no need to serialize parameters that will be shared immediately afterwards. In particular, the current implementation of clone and share is inefficient for single modules in a larger network with flattened parameters, since the storage holding all parameters is serialized. Here, the write method of nn.Module is temporarily overridden with a function that writes empty tensors for all shared parameters. | |
| 2016-04-24 | module-replace | Sergey Zagoruyko |
| 2016-04-07 | add argument passing to module typing shortcuts (#754) | Clément Masson |
| | * add argument passing to the module typing shortcuts (:cuda(...), :float(...), :double(...)) | |
| 2016-04-05 | Update Module.lua | rsfbarreira |
| 2016-03-12 | Merge pull request #689 from colesbury/getParameters | Soumith Chintala |
| | Assert that weights and gradWeights line up in getParameters | |
| 2016-03-04 | Add _type field to nn.Module | Sam Gross |
| | Currently, there is no clear way to get the type of an nn.Module, although you can set the type via `Module:type(<newtype>)`. This changes `Module:type()` to return the current type when called with no arguments. The module's type is stored in the _type field. | |
| 2016-03-04 | Assert that weights and gradWeights line up in getParameters | Sam Gross |
| | Adds a check that the parameters have the same offsets as their gradients after getParameters is called. If they do not line up, then methods such as those in torch/optim will not work. This could happen if the sharing of weights and gradWeights does not match or, due to a bug in the implementation of getParameters, if the storages of weights and gradWeights do not correspond closely. Also fixes the getParameters tests to always share gradWeights when sharing weights. | |
| 2016-02-09 | nn.clearState | Sergey Zagoruyko |
| 2016-01-21 | read and write methods for nn.Module. | Dominik Grewe |
| | Allows modules that want to implement custom `read` and `write` methods to call `parent.read`/`parent.write`. | |
| 2016-01-12 | Revert "Don't re-flatten parameters if they are already flattened" | Soumith Chintala |
| 2016-01-12 | Don't re-flatten parameters if they are already flattened | Sam Gross |
| 2015-09-17 | Revert "fixing cuda getparams" | Soumith Chintala |
| 2015-09-17 | fixing cuda getparams | soumith |
| 2015-09-04 | nn.Module preserve type sharing semantics (#187); add nn.Module.apply | Adam Lerer |
| 2015-09-04 | getParameters: improve memory efficiency, fix bug with non-compact tensors | Adam Lerer |
| 2015-08-03 | Replace Module.flatten by nn.Module.flatten | Georg Ostrovski |
| 2015-07-10 | Add unit tests for hessian.lua, fix bugs detected by the tests. | Andrey Golovizin |
| | * Fix initialization of diagHessianBias for nn.SpatialConvolution. * Fix computing diagHessianBias for nn.SpatialFullConvolution. * Call module:forward() with the proper input before calling accGradParameters(); without that, accDiagHessianParameters() produces incorrect results for some convolution classes. * Move duplicated code from Module.getParameters() to Module.flatten(), which is now used both by the original Module.getParameters() in Module.lua and by the replacement Module.getParameters() in hessian.lua. | |
| 2015-05-13 | Merge pull request #256 from colesbury/lua52 | Soumith Chintala |
| | Rename unpack to table.unpack for Lua 5.2 | |
| 2015-05-05 | Check for `nn.Module` and `nn.Criterion` in recursiveType. | Dominik Grewe |
| 2015-05-05 | Rename unpack to table.unpack for Lua 5.2 | Sam Gross |
| | Torch7 defines table.unpack as unpack if it is not already defined. | |
| 2015-04-28 | Make type() truly recursive. | Dominik Grewe |
| | Recursively iterate over the whole table, converting each tensor to the given type. This removes the need for many specialized type() functions. | |
| 2015-04-24 | Nicer error message when flattening parameters with inconsistent types. | Tim Harley |
| 2015-03-18 | Added 2 calls to collect garbage in getParameters | Xiang Zhang |
| 2015-02-06 | Module:listModules() | Nicholas Leonard |
| 2014-11-21 | Fix various unused variables in nn | Andrew Tulloch |
| 2014-10-27 | Corrected getParameters for partial views | James Kirkpatrick |
| | Module:getParameters was incorrectly overwriting parameters that were partial views on larger storages. | |
| 2014-08-13 | Check that a module is typed to non-nil | James Kirkpatrick |
| 2014-08-05 | getParameters can return an empty tensor | Sergio Gomez |
| | If parameters() is not defined or returns an empty table, getParameters() will return an empty tensor. | |
| 2014-07-25 | Changed module name as per Nicholas's suggestion. | Jonathan Tompson |
| 2014-07-25 | Added 'Module:findModulesByTypename' function. | Jonathan Tompson |
| | Added a comment about the top-level container to the doc. | |
| 2014-07-23 | Fixed the recursive type call. | Jonathan Tompson |
| 2014-07-22 | Made module type conversion for member variables recursive (so that tables of tensors are also converted). | Jonathan Tompson |
| 2014-07-06 | added Module:evaluate/training | nicholas-leonard |
| 2013-06-11 | Using a default scale in Module.backward(). | Clement Farabet |
| | This should not affect anything, as modules are always used within containers. Mostly important during testing. | |
| 2013-03-23 | Sped up getParameters() in simple situations. | Clement Farabet |
| 2012-11-29 | Improved Module:getParameters() speed when using many storages. | Ivo Danihelka |
| 2012-10-21 | Fixed getParameters() for CUDA. When did that break? | Clement Farabet |
| 2012-09-24 | Oops, remove useless print | Clement Farabet |
| 2012-09-21 | Added an extra corner case to getParameters(). | Clement Farabet |
| | This might finally fix all the possible corner cases. Fix by Michael Matthieu. | |
| 2012-07-14 | Attempt to fix getParameters. | Clement Farabet |
| 2012-04-01 | arg -> {...}, for LuaJIT. | Clement Farabet |
| 2012-03-03 | Implementing a call operator for nn modules and criterions. | Clement Farabet |
| 2012-02-19 | Add reset function to module.lua and Sequential.lua. CmdLine accepts ignore=false for string function | Koray Kavukcuoglu |
| 2012-01-25 | initial revamp of torch7 tree | Ronan Collobert |
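The "Rename unpack to table.unpack for Lua 5.2" commits above refer to Lua 5.2 moving the global `unpack` function into the `table` library. As the commit body notes, Torch7 defines one in terms of the other; a minimal sketch of that compatibility shim (plain Lua, independent of Torch):

```lua
-- Lua 5.1 (and LuaJIT) provide the global unpack(); Lua 5.2+ provide
-- table.unpack(). Aliasing one to the other keeps code portable, so
-- callers can always write table.unpack.
table.unpack = table.unpack or unpack

local a, b, c = table.unpack({10, 20, 30})
print(a, b, c)  -- 10	20	30
```

With the shim in place, every call site can use the 5.2 spelling regardless of which interpreter is running.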
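The "arg -> {...}, for LuaJIT" commit above reflects another portability issue: old Lua 5.0-style code read varargs from an implicit `arg` table, which Lua 5.1/LuaJIT no longer create. A minimal sketch of the replacement pattern, using a hypothetical `sum` function (not from the nn codebase):

```lua
-- Capture varargs explicitly with {...} instead of the implicit `arg`
-- table. select('#', ...) returns the true argument count, which is
-- stored in the `n` field (holes from nil arguments are preserved).
local function sum(...)
  local args = {n = select('#', ...), ...}
  local total = 0
  for i = 1, args.n do
    total = total + (args[i] or 0)  -- treat explicit nils as 0
  end
  return total
end

print(sum(1, 2, 3))  -- 6
```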