author     Malcolm Reynolds <mareynolds@google.com>  2016-02-22 21:11:49 +0300
committer  Malcolm Reynolds <mareynolds@google.com>  2016-02-22 21:11:49 +0300
commit     1394f5ce764aa7f4aecfd1c5945ba15a03c141e7 (patch)
tree       8bb1181ccd1c77ab5af50bd344dc5fe415cc0adf /utils.lua
parent     096bc70ff078b179c0ca43be5efc9469247005a5 (diff)
Make nn.utils.recursiveType add empty Tensors to the tensorCache.
This means if we have a module which wraps another module, and does
something like:
function MyWrapper:__init(innerModule)
   self._innerModule = innerModule
   self.gradInput = self._innerModule.gradInput
end
... then calling :type() before anything has been forwarded (i.e. while
the Tensors are still zero-size) will now correctly preserve this aliasing.
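The scenario the commit message describes can be sketched as a small Torch module. This is a hypothetical illustration (it assumes Torch7 and nn are installed; `MyWrapper` is the example name from the message, not a real nn module):

```lua
-- Sketch of the aliasing scenario from the commit message.
require 'nn'

local MyWrapper, parent = torch.class('nn.MyWrapper', 'nn.Module')

function MyWrapper:__init(innerModule)
   parent.__init(self)
   self._innerModule = innerModule
   -- Alias the inner module's gradInput: both fields now refer to the
   -- same (still empty, zero-size) Tensor.
   self.gradInput = self._innerModule.gradInput
end

local wrapper = nn.MyWrapper(nn.Linear(2, 2))

-- Before this commit, empty Tensors were never added to the tensorCache,
-- so converting the type before any forward() call gave the two fields
-- two distinct converted Tensors. With the fix they stay aliased:
wrapper:type('torch.FloatTensor')
assert(wrapper.gradInput == wrapper._innerModule.gradInput)
```

Once a forward/backward pass has resized the buffers the aliasing was already preserved; the fix only matters for the pre-forward, zero-size case.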
Diffstat (limited to 'utils.lua')
-rw-r--r--  utils.lua  2
1 file changed, 1 insertion(+), 1 deletion(-)
@@ -64,8 +64,8 @@ function nn.utils.recursiveType(param, type, tensorCache)
             param:size(),
             param:stride()
          )
-         tensorCache[param] = newparam
       end
+      tensorCache[param] = newparam
    end
    assert(torch.type(newparam) == type)
    param = newparam