Age | Commit message | Author | |
---|---|---|---|
2016-07-27 | fix for :type() to typecast the children as well | soumith | |
2016-07-27 | type fix for forwardnodes | Soumith Chintala | |
2016-06-07 | Quick fix for type. (#118) | Jonathan Tompson | |
2016-04-28 | Add :replace() for gModule | Adam Paszke | |
2016-04-24 | Returning the module's type to conform to torch/nn#691 | Rohan Padhye | |
2016-03-09 | Improved error message (expecting table of inputs) | Marek | |
I think this is a better error message. Currently, if I pass in two comma-separated inputs, the error is: "Error, expecting 2 inputs". After this change it will be: "Error, expecting table of 2 inputs.". | |||
2016-03-08 | Merge pull request #103 from fbesse/gradoutput_zero_optim | Soumith Chintala | |
Added optimisation to bypass the buffer allocation when all but one gradOutput are zero one-element tensors. | |||
2016-03-03 | Clear tensors in a whole graph on :clearState() | Adam Paszke | |
2016-02-16 | Added optimisation to bypass the buffer allocation when all but one gradOutput are zero one-element tensors. | Frederic Besse | |
2016-02-13 | include and paths.dofile -> require | Soumith Chintala | |
2016-01-25 | Merge pull request #98 from malcolmreynolds/remove_zeroing_optimisation | koray kavukcuoglu | |
Don't bother filling a Tensor with zero right before we copy into it | |||
2016-01-25 | Don't bother filling a Tensor with zero right before we copy into it | Malcolm Reynolds | |
2016-01-20 | Store a reverse mapping when wiring together graph, detect unused nodes. | Malcolm Reynolds | |
The connectivity checking code was previously unable to detect the following error case: `local input = nn.Identity()(); local usedOutput = nn.Linear(20, 10)(input); local unusedOutput = nn.Linear(20, 10)(input); local gmod = nn.gModule({input}, {usedOutput})`. With this fix, gModule will throw an error when called, because of unusedOutput. This is a backwards-incompatible change, but I feel that the current flexibility is error prone, and I can't see any advantage to it. We have flushed out a couple of bugs in internal code with this change. | |||
2015-11-20 | Fix for Lua 5.2+ which removed table.maxn | Malcolm Reynolds | |
2015-11-19 | Make error messages clearer, disallow empty table in inputs. | Malcolm Reynolds | |
This is intended to address the common class of errors I see where people make a mistake connecting up their modules, but the error message is either unclear or doesn't point towards where the mistake actually is. The 'what is this in the input' message is now explicit about what the problem is, and if people pass in an nn.Module (meaning they probably forgot a set of parentheses) instead of an nngraph.Node, we say this explicitly. The '1 of split(2) outputs unused' message (which previously provided no information about which split was incorrect) now includes the file and line number of both the place where the Node was constructed and the place where :split() was called. Hopefully this should reduce debugging time drastically. Finally, I have disallowed passing an empty table as the input connections, i.e. 'nn.Identity()({})' will error. I cannot see a use case for this (if you have no input connections, just leave the second parens empty). The risk is that when people do 'nn.Identity()({variableWithTypo})', thinking they have made a connection when actually they haven't, errors surface much later on, whereas with this commit it errors straight away. This *could* break existing code, but there's an easy-to-apply fix at each callsite. Koray has approved this restriction to the API, but I appreciate others may have a view here. | |||
2015-10-16 | Initialize modules table after read if necessary. | Dominik Grewe | |
2015-10-16 | Use nn.Container as base class for gModule | Andreas Köpf | |
Added modules to container in ctor, removed redundant methods training(), evaluate(), share(), zeroGradParameters(), parameters(), clone() which are now provided by the base classes (nn.gModule -> nn.Container -> nn.Module). | |||
2015-10-14 | Adding a :applyToModules() method to gModule; :training() and :evaluate() should be applied to 'self' as well. | Yori Zwols | |
2015-10-01 | Integrate apply() and type() improvements from https://github.com/torch/nn/pull/303 | Adam Lerer | |
2015-09-11 | make sure forward/backward runs can deal with parameter nodes since they do not have any inputs coming in. add a display function that does not use qt, but the browser | Koray Kavukcuoglu | |
2015-09-11 | support for parameter nodes | koray kavukcuoglu | |
2015-09-07 | Replace utils.istensor with torch.isTensor. | Dan Horgan | |
torch.isTensor is more precise. | |||
2015-09-04 | Whitespace cleanup. | Clement Farabet | |
2015-07-23 | Added an assert to check the number of inputs to a split. | Ivo Danihelka | |
2015-07-10 | convert node.data.gradOutputBuffer on type change | Hugh Perkins | |
2015-06-16 | adding share,map,clone | soumith | |
2015-06-09 | Added share function for gModule | Xiang Zhang | |
2015-05-20 | Checked that gModule inputs and outputs are Nodes. | Ivo Danihelka | |
2015-03-27 | Added __tostring__ to gModule. | Ivo Danihelka | |
2015-03-12 | Include all data in type conversion when type() is called on an nn.gModule | Yori Zwols | |
2015-02-24 | This makes gModules behave consistently with nn.Modules and nn.Container, and solves the issue of trying to change the behaviour of layers like nn.Dropout which may be in nested gModules. | etg | |
2014-02-17 | Replaced bfs() usage by iterating over the forwardnodes. | Ivo Danihelka | |
2014-02-13 | Merge pull request #20 from fidlej/topic_mapindex_label | koray kavukcuoglu | |
Displayed the node ids of the input nodes in the mapindex | |||
2014-02-12 | Added backward compatibility for graph with no self.nInputs. | Ivo Danihelka | |
2014-02-12 | Displayed the node ids of the input nodes in the mapindex. | Ivo Danihelka | |
2014-02-10 | Checked that no split output is unused. | Ivo Danihelka | |
2013-09-17 | Checked unused inputs. | Ivo Danihelka | |
2013-09-01 | Ensured the needed gradOutputBuffer size. | Ivo Danihelka | |
2013-09-01 | Allowed the gradInputs to be tables with tensors. | Ivo Danihelka | |
2013-07-30 | Added gModule:zeroGradParameters | Andreas Fidjeland | |
2013-07-26 | Implemented custom gModule:type(). | Ivo Danihelka | |
2013-07-19 | Disallowed splitting a tensor. | Ivo Danihelka | |
2013-07-18 | Mentioned the used data structures. | Ivo Danihelka | |
2013-07-18 | Used split on innode. | Ivo Danihelka | |
2013-07-18 | Removed the partial support for nn.Criterion. Wrap the criterion in a module instead. | Ivo Danihelka | |
2013-07-18 | Removed the ignoring of weird nodes. | Ivo Danihelka | |
2013-07-17 | Reused the computed gradOutputBuffer. | Ivo Danihelka | |
2013-07-17 | Clarified the meaning of innode.data.gradOutput. | Ivo Danihelka | |
2013-07-17 | Kept only gradients to sum inside of the data.gradOutput. | Ivo Danihelka | |
2013-07-17 | Used common code to get the gradOutput. | Ivo Danihelka | |
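The unused-node check described in the 2016-01-20 commit can be sketched as follows. This is a minimal illustration, assuming Torch with the nn and nngraph packages installed; the layer sizes are illustrative, not from the log:

```lua
local nn = require 'nn'
require 'nngraph'

-- Wiring where every node contributes to an output: constructs fine.
local input = nn.Identity()()
local hidden = nn.Linear(20, 10)(input)
local gmod = nn.gModule({input}, {hidden})

-- Wiring with a dangling node: since the 2016-01-20 change, gModule
-- walks the reverse mapping, detects the unused nn.Linear, and raises
-- an error at construction time instead of silently ignoring it.
local input2 = nn.Identity()()
local used = nn.Linear(20, 10)(input2)
local unused = nn.Linear(20, 10)(input2)  -- never reaches an output
local ok, err = pcall(function()
  return nn.gModule({input2}, {used})
end)
-- ok is false: construction fails because 'unused' is dangling
```

Before this change, the second gModule would have been constructed silently, with the dangling nn.Linear simply never executed.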