
github.com/torch/nn.git
author    Malcolm Reynolds <mareynolds@google.com>  2016-02-22 21:11:49 +0300
committer Malcolm Reynolds <mareynolds@google.com>  2016-02-22 21:11:49 +0300
commit    1394f5ce764aa7f4aecfd1c5945ba15a03c141e7 (patch)
tree      8bb1181ccd1c77ab5af50bd344dc5fe415cc0adf /utils.lua
parent    096bc70ff078b179c0ca43be5efc9469247005a5 (diff)
Make nn.utils.recursiveType add empty Tensors to the tensorCache.
This means that if we have a module which wraps another module and does something like:

    function MyWrapper:__init(innerModule)
       self._innerModule = innerModule
       self.gradInput = self._innerModule.gradInput
    end

then calling :type before we have forwarded anything (i.e. while we still have zero-size Tensors) will now correctly preserve this aliasing.
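A minimal sketch of the wrapper scenario the commit message describes (the class name MyWrapper comes from the message; everything else here is illustrative, not part of the patch):

    -- Hypothetical wrapper module that aliases its inner module's gradInput.
    local MyWrapper, parent = torch.class('nn.MyWrapper', 'nn.Module')

    function MyWrapper:__init(innerModule)
       parent.__init(self)
       self._innerModule = innerModule
       -- At construction time, gradInput is still an empty (zero-size) Tensor,
       -- and this assignment makes the two fields point at the same object.
       self.gradInput = self._innerModule.gradInput
    end

    -- Before this patch, nn.utils.recursiveType only recorded converted
    -- Tensors with a storage in the tensorCache, so converting the type of a
    -- never-forwarded network produced two distinct empty Tensors and the
    -- aliasing was silently lost. With empty Tensors also cached, both fields
    -- map to the same converted Tensor:
    local wrapper = nn.MyWrapper(nn.Linear(10, 10))
    wrapper:type('torch.FloatTensor')
    assert(wrapper.gradInput == wrapper._innerModule.gradInput)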
Diffstat (limited to 'utils.lua')
-rw-r--r--  utils.lua  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/utils.lua b/utils.lua
index 6084f09..0035f48 100644
--- a/utils.lua
+++ b/utils.lua
@@ -64,8 +64,8 @@ function nn.utils.recursiveType(param, type, tensorCache)
                param:size(),
                param:stride()
             )
-            tensorCache[param] = newparam
          end
+         tensorCache[param] = newparam
       end
       assert(torch.type(newparam) == type)
       param = newparam