Welcome to the mirror list, hosted at ThFree Co, Russian Federation.

github.com/torch/nngraph.git - commit history (mirror)
Date        Commit message  [tag]  (author)
2016-07-27  fix for :type() to typecast the children as well  [tag: typefix]  (soumith)
2016-07-27  type fix for forwardnodes  (Soumith Chintala)
2016-06-07  Quick fix for type. (#118)  (Jonathan Tompson)
2016-04-28  Add :replace() for gModule  (Adam Paszke)
2016-04-24  Returning the module's type to conform to torch/nn#691  (Rohan Padhye)
2016-03-09  Improved error message (expecting table of inputs)  (Marek)
    I think this is a better error message. Currently, if I pass in two comma-separated inputs, the error is "Error, expecting 2 inputs". After this change it will be "Error, expecting table of 2 inputs.".
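The mistake behind this message can be sketched as follows (a minimal illustration, assuming Torch7 with nngraph installed; nn.CAddTable and the variable names are just illustrative choices, not from the commit itself):

```lua
require 'nngraph'  -- also pulls in 'nn' and 'torch'

-- A two-input graph that sums its inputs.
local in1 = nn.Identity()()
local in2 = nn.Identity()()
local sum = nn.CAddTable()({in1, in2})
local g = nn.gModule({in1, in2}, {sum})

-- Wrong: comma-separated inputs; after this commit the error reads
-- "expecting table of 2 inputs" rather than the vaguer "expecting 2 inputs":
--   g:forward(torch.ones(3), torch.ones(3))

-- Right: wrap the inputs in a single table.
local out = g:forward({torch.ones(3), torch.ones(3)})
print(out)
```

A multi-input gModule always takes (and returns) tables, so the clarified message points directly at the missing table wrapper.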
2016-03-08  Merge pull request #103 from fbesse/gradoutput_zero_optim  (Soumith Chintala)
    Added optimisation to bypass the buffer allocation when all but one gradOutput are zero one-element tensors.
2016-03-03  Clear tensors in a whole graph on :clearState()  (Adam Paszke)
2016-02-16  Added optimisation to bypass the buffer allocation when all but one gradOutput are zero one-element tensors.  (Frederic Besse)
2016-02-13  include and paths.dofile -> require  [tag: require]  (Soumith Chintala)
2016-01-25  Merge pull request #98 from malcolmreynolds/remove_zeroing_optimisation  (koray kavukcuoglu)
    Don't bother filling a Tensor with zero right before we copy into it
2016-01-25  Don't bother filling a Tensor with zero right before we copy into it  (Malcolm Reynolds)
2016-01-20  Store a reverse mapping when wiring together graph, detect unused nodes.  (Malcolm Reynolds)
    The connectivity checking code was previously unable to detect the following error case:

        local input = nn.Identity()()
        local usedOutput = nn.Linear(20, 10)(input)
        local unusedOutput = nn.Linear(20, 10)(input)
        local gmod = nn.gModule({input}, {usedOutput})

    With this fix, when gModule is called it will throw an error, because of unusedOutput. This is a backwards-incompatible change, but I feel that the current flexibility is error prone, and I can't see any advantage to it. We have flushed out a couple of bugs in internal code with this change.
2015-11-20  Fix for Lua 5.2+ which removed table.maxn  (Malcolm Reynolds)
2015-11-19  Make error messages clearer, disallow empty table in inputs.  (Malcolm Reynolds)
    This is intended to address the common class of errors I see where people make a mistake connecting up their modules, but the error message is either unclear or doesn't point towards where the mistake actually is.

    The 'what is this in the input' message is now explicit about what the problem is, and if people pass in an nn.Module (meaning they probably forgot a set of parentheses) instead of an nngraph.Node, we say this explicitly.

    The '1 of split(2) outputs unused' message (which previously provided no information about which split was incorrect) now includes the file and line number of both the place where the Node was constructed and the place where :split() was called. Hopefully this should reduce debugging time drastically.

    Finally, I have disallowed passing an empty table as the input connections, i.e. 'nn.Identity()({})' will error. I cannot see a use case for this (if you have no input connections, just leave the second parens empty). The risk is that people write 'nn.Identity()({variableWithTypo})', thinking they have made a connection when actually they haven't; that is likely to cause errors much later on, whereas with this commit it errors straight away. This *could* break existing code, but there's an easy-to-apply fix that needs to be done at each call site. Koray has approved this restriction to the API, but I appreciate others may have a view here.
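The forgotten-parentheses mistake this commit targets can be sketched like so (a minimal illustration, assuming Torch7 with nngraph; the layer sizes and variable names are arbitrary, not from the commit):

```lua
require 'nngraph'

-- The trailing () turns an nn.Module into an nngraph Node.
local input = nn.Identity()()

-- Wrong: nn.Linear(20, 10) is a bare nn.Module, not a Node; the clearer
-- error now says explicitly that a module was passed where a Node belongs:
--   local h = nn.Tanh()(nn.Linear(20, 10))

-- Also wrong after this commit: an empty table of input connections,
--   local x = nn.Identity()({})
-- which now errors immediately instead of failing much later.

-- Right: call the module with its parent Node to wire it into the graph.
local h = nn.Tanh()(nn.Linear(20, 10)(input))
local g = nn.gModule({input}, {h})
print(g:forward(torch.randn(20)))  -- a 10-element output
```

Failing fast at construction time, rather than deep inside forward/backward, is the whole point of these message changes.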
2015-10-16  Initialize modules table after read if necessary.  (Dominik Grewe)
2015-10-16  Use nn.Container as base class for gModule  (Andreas Köpf)
    Added modules to container in ctor; removed redundant methods training(), evaluate(), share(), zeroGradParameters(), parameters(), clone(), which are now provided by the base classes (nn.gModule -> nn.Container -> nn.Module).
2015-10-14  Adding an :applyToModules() method to gModule; :training() and :evaluate() should be applied to 'self' as well.  (Yori Zwols)
2015-10-01  Integrate apply() and type() improvements from https://github.com/torch/nn/pull/303  (Adam Lerer)
2015-09-11  make sure forward/backward runs can deal with parameter nodes, since they do not have any inputs coming in; add a display function that uses the browser instead of qt  [tag: nnop]  (Koray Kavukcuoglu)
2015-09-11  support for parameter nodes  (koray kavukcuoglu)
2015-09-07  Replace utils.istensor with torch.isTensor.  (Dan Horgan)
    torch.isTensor is more precise.
2015-09-04  Whitespace cleanup.  (Clement Farabet)
2015-07-23  Added an assert to check the number of inputs to a split.  (Ivo Danihelka)
2015-07-10  convert node.data.gradOutputBuffer on type change  (Hugh Perkins)
2015-06-16  adding share, map, clone  (soumith)
2015-06-09  Added share function for gModule  (Xiang Zhang)
2015-05-20  Checked that gModule inputs and outputs are Nodes.  (Ivo Danihelka)
2015-03-27  Added __tostring__ to gModule.  (Ivo Danihelka)
2015-03-12  Include all data in type conversion when type() is called on an nn.gModule  (Yori Zwols)
2015-02-24  This makes gModules behave consistently with nn.Modules and nn.Container, and solves the issue of trying to change the behaviour of layers like nn.Dropout which may be in nested gModules.  (etg)
2014-02-17  Replaced bfs() usage by iterating over the forwardnodes.  (Ivo Danihelka)
2014-02-13  Merge pull request #20 from fidlej/topic_mapindex_label  (koray kavukcuoglu)
    Displayed the node ids of the input nodes in the mapindex
2014-02-12  Added backward compatibility for graphs with no self.nInputs.  (Ivo Danihelka)
2014-02-12  Displayed the node ids of the input nodes in the mapindex.  (Ivo Danihelka)
2014-02-10  Checked that no split output is unused.  (Ivo Danihelka)
2013-09-17  Checked unused inputs.  [tag: 0.1]  (Ivo Danihelka)
2013-09-01  Ensured the needed gradOutputBuffer size.  (Ivo Danihelka)
2013-09-01  Allowed the gradInputs to be tables with tensors.  (Ivo Danihelka)
2013-07-30  Added gModule:zeroGradParameters  (Andreas Fidjeland)
2013-07-26  Implemented custom gModule:type().  (Ivo Danihelka)
2013-07-19  Disallowed splitting a tensor.  (Ivo Danihelka)
2013-07-18  Mentioned the used data structures.  (Ivo Danihelka)
2013-07-18  Used split on innode.  (Ivo Danihelka)
2013-07-18  Removed the partial support for nn.Criterion. Wrap the criterion in a module instead.  (Ivo Danihelka)
2013-07-18  Removed the ignoring of weird nodes.  (Ivo Danihelka)
2013-07-17  Reused the computed gradOutputBuffer.  (Ivo Danihelka)
2013-07-17  Clarified the meaning of innode.data.gradOutput.  (Ivo Danihelka)
2013-07-17  Kept only gradients to sum inside of the data.gradOutput.  (Ivo Danihelka)
2013-07-17  Used common code to get the gradOutput.  (Ivo Danihelka)