github.com/torch/dok.git
author     koray kavukcuoglu <koray@kavukcuoglu.org>  2012-02-13 09:33:09 +0400
committer  koray kavukcuoglu <koray@kavukcuoglu.org>  2012-02-13 09:33:09 +0400
commit     7bb7d7d2bbc2ec7499bc77a2334fbcb35e045958 (patch)
tree       3850d20c703b851563ca817c87eea1903c33ed58
parent     e39ed8bc955c112029789e93e208f813eeaa20e5 (diff)

big pass over documentation to make titles consistent across all packages..

 -rw-r--r--  doklua/index.dok       14
 -rw-r--r--  doktutorial/index.dok  26
 2 files changed, 20 insertions(+), 20 deletions(-)
diff --git a/doklua/index.dok b/doklua/index.dok
index 6baad95..8728c6b 100644
--- a/doklua/index.dok
+++ b/doklua/index.dok
@@ -13,7 +13,7 @@ information, or have a look on the [[LuaManual|Lua Reference Manual]].
===== Why choose Lua? =====
-=== Lua is a proven and robust language ===
+==== Lua is a proven and robust language ====
Lua has been used in
[[http://www.lua.org/uses.html|many industrial applications]] (e.g.,
@@ -25,7 +25,7 @@ is currently the leading scripting language in games. Lua has a solid
[[http://www.lua.org/versions.html|versions]] of Lua have been released
and used in real applications since its creation in 1993.
-=== Lua is fast ===
+==== Lua is fast ====
Lua has a deserved reputation for performance. To
claim to be "as fast as Lua" is an aspiration of other scripting
@@ -34,7 +34,7 @@ of interpreted scripting languages. Lua is fast not only in fine-tuned
benchmark programs, but in real life too. A substantial fraction of large
applications have been written in Lua.
-=== Lua is portable ===
+==== Lua is portable ====
Lua is [[http://www.lua.org/download.html|distributed]] in a small
package that builds out-of-the-box in all platforms that have an ''ANSI/ISO C''
@@ -43,7 +43,7 @@ devices (such as handheld computers and cell phones that use ''BREW'', ''Symbian
''Pocket PC'', etc.) and embedded microprocessors (such as ''ARM'' and ''Rabbit'') for
applications like ''Lego MindStorms''.
-=== Lua is embeddable ===
+==== Lua is embeddable ====
Lua is a fast language engine with small footprint that you can embed into
your application. Lua has a simple and well documented ''API'' that allows
@@ -55,7 +55,7 @@ extend programs written not only in ''C'' and ''C++'', but also in ''Java'', ''C
such as
''Perl'' and ''Ruby''.
-=== Lua is simple and powerful ===
+==== Lua is simple and powerful ====
A fundamental concept in the design of Lua is to provide //meta-mechanisms//
for implementing features, instead of providing a host of features directly
@@ -65,14 +65,14 @@ inheritance. Lua's meta-mechanisms bring an economy of concepts and keep
the language small, while allowing the semantics to be extended in
unconventional ways.
-=== Lua is free ===
+==== Lua is free ====
Lua is free software, distributed under a
[[http://www.lua.org/license.html|liberal license]] (the well-known ''MIT''
license). It can be used for both academic and commercial purposes at
absolutely no cost. Just [[http://www.lua.org/download.html|download]] it and use it.
-===== Where does Lua come from? =====
+==== Where does Lua come from? ====
Lua is designed and implemented by a team at
[[http://www.puc-rio.br/|PUC-Rio]], the Pontifical Catholic University of
diff --git a/doktutorial/index.dok b/doktutorial/index.dok
index a11a95e..b424a14 100644
--- a/doktutorial/index.dok
+++ b/doktutorial/index.dok
@@ -10,7 +10,7 @@ vectors, matrices and tensors and how to build and train a basic
neural network. For anything else, you should know how to access the
html help and read about how to do it.
-====== What is Torch? ======
+===== What is Torch? =====
Torch7 provides a Matlab-like environment for state-of-the-art machine
learning algorithms. It is easy to use and provides a very efficient
@@ -18,14 +18,14 @@ implementation, thanks to a easy and fast scripting language (Lua) and
an underlying C/C++ implementation. You can read more about Lua
[[http://www.lua.org|here]].
-====== Installation ======
+===== Installation =====
First before you can do anything, you need to install Torch7 on your
machine. That is not described in detail here, but is instead
described in the [[..:install:index|installation help]].
-====== Checking your installation works and requiring packages ======
+===== Checking your installation works and requiring packages =====
If you have got this far, hopefully your Torch installation works. A simple
way to make sure it does is to start Lua from the shell command line,
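Only part of this section is visible in the hunk; a minimal sketch of the check it describes, assuming the Torch7 interpreter is started from the shell and that the ''nn'' package is installed alongside it:

<file lua>
-- at the interactive prompt, load an extra package and build a small tensor;
-- if both steps succeed, the installation is working
require 'torch'               -- normally already loaded by the Torch shell
require 'nn'                  -- neural-network package distributed with Torch7
print(torch.Tensor(2, 3):zero())
</file>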
@@ -55,7 +55,7 @@ or higher-dimensional objects (tensors).
Tensors). To see the list of all packages distributed with Torch7,
click [[..:index|here]].
-====== Getting Help ======
+===== Getting Help =====
There are two main ways of getting help in Torch7. One way is ofcourse
the html formatted help. However, another and easier method is to use
@@ -100,7 +100,7 @@ t7> torch.randn(
</file>
-====== Torch Basics: Playing with Tensors ======
+===== Torch Basics: Playing with Tensors =====
Ok, now we are ready to actually do something in Torch. Lets start by
constructing a vector, say a vector with 5 elements, and filling the
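Only the start of this example is visible in the hunk; a minimal sketch of the construction it begins to describe:

<file lua>
-- a vector with 5 elements, the i-th element filled with the value i
x = torch.Tensor(5)
for i = 1, 5 do
   x[i] = i
end
print(x)
</file>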
@@ -229,7 +229,7 @@ t7> =torch.mm(a,b)
</file>
-====== Types in Torch7 ======
+===== Types in Torch7 =====
In Torch7, different types of tensors can be used. By default, all
tensors are created using ''double'' type. ''torch.Tensor'' is a
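The rest of the section falls outside the hunk; a short sketch of working with explicit tensor types, following the ''torch.<Type>Tensor'' naming used in Torch7:

<file lua>
d = torch.DoubleTensor(2, 2):fill(1)   -- 64-bit floating point (the stated default)
f = torch.FloatTensor(2, 2):fill(1)    -- 32-bit floating point
i = torch.IntTensor(2, 2):fill(1)      -- 32-bit integer
print(d)
print(f)
print(i)
</file>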
@@ -244,12 +244,12 @@ t7> =torch.Tensor()
[torch.FloatTensor with no dimension]
</file>
-====== Example: training a neural network ======
+===== Example: training a neural network =====
We will show now how to train a neural network using the [[..:nn:index|nn]] package
available in Torch.
-===== Torch basics: building a dataset using Lua tables =====
+==== Torch basics: building a dataset using Lua tables ====
In general the user has the freedom to create any kind of structure he
wants for dealing with data.
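The structure itself is outside the hunk; a sketch of the table convention the later hunks rely on (a ''size()'' method plus ''{input, target}'' pairs, as expected by ''nn.StochasticGradient''; the XOR-style labelling rule is an assumption):

<file lua>
dataset = {}
function dataset:size() return 100 end          -- number of training examples
for i = 1, dataset:size() do
   local input  = torch.randn(2)                -- two random inputs
   local target = torch.Tensor(1)
   if input[1] * input[2] > 0 then              -- assumed XOR-like rule on the signs
      target[1] = -1
   else
      target[1] = 1
   end
   dataset[i] = {input, target}                 -- example i as an {input, target} pair
end
</file>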
@@ -302,7 +302,7 @@ for i=1,dataset:size() do
end
</file>
-===== Torch basics: building a neural network =====
+==== Torch basics: building a neural network ====
To train a neural network we first need some data. We can use the XOR data
we just generated in the section before. Now all that remains is to define
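The network definition is outside the hunk; a sketch consistent with the ''mlp:add(nn.Linear(HUs,outputs))'' line visible in the next hunk header (the layer sizes are assumptions):

<file lua>
require 'nn'
inputs  = 2                         -- dimensionality of one example
HUs     = 20                        -- hidden units (assumed value)
outputs = 1                         -- one output per example

mlp = nn.Sequential()               -- feed-forward container
mlp:add(nn.Linear(inputs, HUs))     -- input layer
mlp:add(nn.Tanh())                  -- non-linearity
mlp:add(nn.Linear(HUs, outputs))    -- output layer
</file>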
@@ -336,7 +336,7 @@ mlp:add(nn.Linear(HUs,outputs))
</file>
-===== Torch basics: training a neural network =====
+==== Torch basics: training a neural network ====
Now we're ready to train.
This is done with the following code:
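The training code is outside the hunk; a sketch of the ''nn.StochasticGradient'' route the later sections refer to (criterion choice and learning rate are assumptions):

<file lua>
criterion = nn.MSECriterion()                     -- mean-squared-error loss
trainer = nn.StochasticGradient(mlp, criterion)   -- simple stochastic gradient trainer
trainer.learningRate = 0.01
trainer:train(dataset)                            -- dataset built as sketched above
</file>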
@@ -373,7 +373,7 @@ See the nn package description of the
for more details.
-===== Torch basics: testing your neural network =====
+==== Torch basics: testing your neural network ====
To test your network on a single example you can do this:
<file lua>
@@ -402,7 +402,7 @@ t7> x=torch.Tensor(2); x[1]=-0.5; x[2]=-0.5; print(mlp:forward(x))
[torch.DoubleTensor of dimension 1]
</file>
-===== Manual Training of a Neural Network =====
+==== Manual Training of a Neural Network ====
Instead of using the [[..:nn:index#nn.StochasticGradient|StochasticGradient]] class
you can directly make the forward and backward calls on the network yourself.
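Only the introductory sentence is inside the hunk; a sketch of what such a manual loop typically looks like with ''nn'' modules, reusing ''mlp'', ''criterion'' and ''dataset'' from the sketches above (iteration count and learning rate are assumptions):

<file lua>
learningRate = 0.01
for i = 1, 2500 do
   -- pick a random training pair
   local example = dataset[math.random(dataset:size())]
   local input, target = example[1], example[2]

   -- forward pass through the network and the loss
   criterion:forward(mlp:forward(input), target)

   -- backward pass: reset accumulated gradients, backpropagate, update weights
   mlp:zeroGradParameters()
   mlp:backward(input, criterion:backward(mlp.output, target))
   mlp:updateParameters(learningRate)
end
</file>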
@@ -448,7 +448,7 @@ end
Super!
-====== Concluding remarks ======
+===== Concluding remarks =====
That's the end of this tutorial, but not the end of what you have left
to discover of Torch! To explore more of Torch, you should take a look