github.com/torch/nn.git
author    Nicholas Léonard <nick@nikopia.org>  2014-05-11 11:00:02 +0400
committer Nicholas Léonard <nick@nikopia.org>  2014-05-11 11:00:02 +0400
commit    a3a52fa931eac14f44b6350181a6195b701b74f2 (patch)
tree      039d6fed67c594367c3379254358804f1c5689c9 /README.md
parent    65ef0e3f63850d078a80c4adf4b2b62ea14b52e0 (diff)
LookupTable doc typos (wrong axes order)
Diffstat (limited to 'README.md')
-rw-r--r--  README.md | 18
1 file changed, 9 insertions, 9 deletions
diff --git a/README.md b/README.md
index 4a71501..5048271 100644
--- a/README.md
+++ b/README.md
@@ -1732,7 +1732,7 @@ at `1` and can go up to `nIndex`. For each index, it outputs a corresponding `Te
specified by `sizes` (a `LongStorage`) or `size1 x size2 x...`.
Given a 1D input, the output tensors are concatenated,
-generating a `size1 x size2 x ... x sizeN x n` tensor, where `n`
+generating a `n x size1 x size2 x ... x sizeN` tensor, where `n`
is the size of a 1D `input` tensor.
Again with a 1D input, when only `size1` is provided, the `forward(input)` is equivalent to
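For reference (not part of the patch), a minimal Lua sketch of the corrected 1D behaviour described above, assuming the multi-size constructor `nn.LookupTable(nIndex, size1, size2, ...)` mentioned in the surrounding documentation; the lookup weights are randomly initialized, so only the output shape is meaningful:
```lua
require 'nn'

-- hypothetical example: 10 entries, each a 2x4 tensor
module = nn.LookupTable(10, 2, 4)
input  = torch.Tensor{5, 7, 5}        -- n = 3 indices
output = module:forward(input)
print(output:size())                  -- 3 x 2 x 4, i.e. n x size1 x size2
```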
@@ -1748,21 +1748,21 @@ where `M` is a 2D matrix `size1 x nIndex` containing the parameters of the looku
-- a lookup table containing 10 tensors of size 3
module = nn.LookupTable(10, 3)
- input = torch.Tensor(4)
- input[1] = 1; input[2] = 2; input[3] = 1; input[4] = 10;
+ input = torch.Tensor{1,2,1,10}
print(module:forward(input))
```
Outputs something like:
```lua
--0.1784 2.2045 -0.1784 -0.2475
--1.0120 0.0537 -1.0120 -0.2148
--1.2840 0.8685 -1.2840 -0.2792
-[torch.Tensor of dimension 3x4]
+-1.4415 -0.1001 -0.1708
+-0.6945 -0.4350 0.7977
+-1.4415 -0.1001 -0.1708
+-0.0745 1.9275 1.0915
+[torch.DoubleTensor of dimension 4x3]
```
-Note that the first column vector is the same than the 3rd one!
+Note that the first row vector is the same as the 3rd one!
-Given a 2D input tensor of size `m x n`, the output is a `m x size1 x size2 x ... x sizeN x n`
+Given a 2D input tensor of size `m x n`, the output is a `m x n x size1 x size2 x ... x sizeN`
tensor, where `m` is the number of samples in
the batch and `n` is the number of indices per sample.
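Similarly, a minimal sketch (not part of the patch) of the corrected batch case, reusing the `nn.LookupTable(10, 3)` module from the example in the diff; the values are random, so only the `m x n x size1` shape matters:
```lua
require 'nn'

module = nn.LookupTable(10, 3)          -- 10 entries of size 3
-- hypothetical batch of m = 2 samples with n = 4 indices each
input  = torch.Tensor{{1, 2, 1, 10},
                      {3, 5, 3,  7}}
output = module:forward(input)
print(output:size())                    -- 2 x 4 x 3, i.e. m x n x size1
```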