Torch (machine learning): Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 00:41, 1 June 2014

Torch
Original author(s): Ronan Collobert, Koray Kavukcuoglu, Clement Farabet
Initial release: October 2002[1]
Stable release: 7.0 / April 24, 2014[2]
Written in: Lua, LuaJIT, C, CUDA and C++
Operating system: Linux, Android, Mac OS X, iOS, Microsoft Windows
Type: Library for deep learning
License: BSD License
Website: torch.ch

Torch is an open-source deep learning library for the Lua programming language[3] and a scientific computing framework with wide support for machine learning algorithms. It uses the fast scripting language LuaJIT and an underlying C implementation.

torch

The core package of Torch is torch. It provides a flexible N-dimensional array or Tensor, which supports routines for indexing, slicing, transposing, type-casting, resizing, sharing storage and cloning. This object is used by most other packages and thus forms the core object of the library.

The following is an example of using torch via its REPL interpreter:

$> th
> a = torch.randn(3,4) 

> =a
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
-1.0434  2.2291  1.0525  0.8465
[torch.DoubleTensor of dimension 3x4]

> a[1][2]
-0.34010116549482

> a:narrow(1,1,2)
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
[torch.DoubleTensor of dimension 2x4]

> a:index(1, torch.LongTensor{1,2})
-0.2381 -0.3401 -1.7844 -0.2615
 0.1411  1.6249  0.1708  0.8299
[torch.DoubleTensor of dimension 2x4]

> a:min()
-1.7844365427828

The torch package also simplifies object-oriented programming and serialization by providing various convenience functions which are used throughout its packages. The torch.class(classname, parentclass) function can be used to create object factories (classes). When the constructor is called, torch initializes and sets a Lua table with the user-defined metatable, which makes the table an object.

Objects created with the torch factory can also be serialized, as long as they do not contain references to objects that cannot be serialized, such as Lua coroutines, and Lua userdata. However, userdata can be serialized if it is wrapped by a table (or metatable) that provides read() and write() methods.
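The class mechanism described above can be sketched as follows. This is a minimal illustration of torch.class, assuming the Torch runtime is loaded; the class name Dog and its methods are hypothetical, chosen only for the example.

```lua
-- Minimal sketch of torch.class; requires the Torch 'torch' package.
do
   local Dog = torch.class('Dog')   -- registers a callable constructor 'Dog'

   function Dog:__init(name)        -- constructor: 'self' is a plain Lua table
      self.name = name              -- with the class metatable attached
   end

   function Dog:speak()
      return self.name .. ' says woof'
   end
end

local d = Dog('Rex')                -- calling the factory invokes __init
print(d:speak())                    -- Rex says woof
print(torch.type(d))                -- Dog
```

Because the instance is an ordinary Lua table with a metatable, it can be passed to torch's serialization functions like any other Torch object, subject to the userdata caveats above.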

nn

The nn package is used for building neural networks. It is divided into modular objects that share a common Module interface. Modules have forward() and backward() methods that allow them to feedforward and backpropagate, respectively. Modules can be joined together using module composites, like Sequential, Parallel and Concat, to create complex task-tailored graphs. Simpler modules like Linear, Tanh and Max make up the basic component modules. This modular interface provides first-order automatic gradient differentiation.
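The Module interface can be sketched with a small network; the layer sizes here (10 inputs, 25 hidden units, 1 output) are arbitrary, and the sketch assumes the nn package is installed.

```lua
require 'nn'

-- A two-layer network built from component modules joined by a composite.
local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))   -- fully connected layer: 10 inputs, 25 hidden units
mlp:add(nn.Tanh())           -- element-wise nonlinearity
mlp:add(nn.Linear(25, 1))    -- output layer

local input  = torch.randn(10)
local output = mlp:forward(input)            -- feedforward pass

-- backward() propagates a gradient w.r.t. the output back to the input,
-- accumulating parameter gradients along the way (first-order autodiff).
local gradOutput = torch.ones(1)
local gradInput  = mlp:backward(input, gradOutput)
```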

Loss functions are implemented as sub-classes of Criterion, which has a similar interface to Module. It also has forward() and backward() methods, for computing the loss and backpropagating gradients, respectively. Criteria are helpful for training neural networks on classical tasks. Common criteria are the mean squared error criterion implemented in MSECriterion and the cross-entropy criterion implemented in ClassNLLCriterion.
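A criterion is paired with a module like this; the sketch assumes the nn package, and the network and target shapes are arbitrary.

```lua
require 'nn'

local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 1))

local criterion = nn.MSECriterion()          -- mean squared error loss

local input  = torch.randn(10)
local target = torch.Tensor{0.5}             -- desired output

local output   = mlp:forward(input)
local loss     = criterion:forward(output, target)    -- scalar loss value
local gradLoss = criterion:backward(output, target)   -- dLoss/dOutput
mlp:backward(input, gradLoss)                         -- backpropagate into the network
```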

The nn package also has a StochasticGradient class for training a neural network using stochastic gradient descent, although the optim package provides many more options in this respect, such as momentum and weight-decay regularization.
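Training with StochasticGradient can be sketched as follows. StochasticGradient expects a dataset object with a size() method and indexed {input, target} pairs; the toy dataset and hyperparameters below are made up for illustration.

```lua
require 'nn'

local mlp = nn.Sequential()
mlp:add(nn.Linear(2, 1))

local criterion = nn.MSECriterion()

-- StochasticGradient expects dataset:size() and dataset[i] = {input, target}.
local dataset = {}
function dataset:size() return 4 end
for i = 1, dataset:size() do
   local x = torch.randn(2)
   dataset[i] = {x, torch.Tensor{x[1] + x[2]}}   -- toy task: learn a sum
end

local trainer = nn.StochasticGradient(mlp, criterion)
trainer.learningRate = 0.01
trainer.maxIteration = 5       -- epochs over the dataset
trainer:train(dataset)
```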

cwrap

Via LuaJIT, Torch provides a simple interface to C. For example, the following C prototype:

TH_API void THTensor_(rand)(THTensor *r_, THGenerator *_generator, THLongStorage *size);

can be adapted to Lua with default values:

local wrap = require 'cwrap'
wrap("rand",
     cname("rand"),
     {{name=Tensor, default=true, returned=true, method={default='nil'}},
      {name='Generator', default=true},
      {name="LongArg"}})

where cname("rand") specifies the C function name, while the first "rand" specifies the Lua function name.

By this same principle, many routines from the Basic Linear Algebra Subprograms (BLAS) have been adapted to work with Torch Tensors. Furthermore, via its cutorch package, various CUDA routines have been adapted to work with the special CudaTensor type. This allows Torch to utilize fast NVIDIA GPUs for some of its supported operations.
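Moving work onto the GPU amounts to converting tensors to CudaTensors; a minimal sketch, assuming the cutorch package and an available NVIDIA GPU (the matrix sizes are arbitrary):

```lua
require 'cutorch'

local a = torch.randn(1000, 1000)   -- torch.DoubleTensor on the CPU
local b = torch.randn(1000, 1000)

local ga = a:cuda()                 -- copy to a torch.CudaTensor on the GPU
local gb = b:cuda()

local gc = torch.mm(ga, gb)         -- matrix multiply runs on the GPU
local c  = gc:float()               -- copy the result back to the CPU
```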

Other packages

Many packages other than the above official packages are used with Torch. These are listed in the Torch cheatsheet. These extra packages provide a wide range of utilities, such as parallelism, asynchronous input/output, and image processing.

Applications

Torch is used by Google DeepMind,[4] the Facebook AI Research Group,[5] the Computational Intelligence, Learning, Vision, and Robotics Lab at NYU,[6] and the Idiap Research Institute.[7] It is used and cited in 240 research papers.[8] For comparison, Theano, a similar library written in Python, C and CUDA, has 138 citations.[9] Torch has been extended for use on Android[10] and iOS.[11] It has been used to build hardware implementations for data flows like those found in neural networks.[12]

References

  1. ^ "Torch: a modular machine learning software library". 30 October 2002. Retrieved 24 April 2014.
  2. ^ Ronan Collobert. "Torch7". GitHub.
  3. ^ Ronan Collobert; Koray Kavukcuoglu; Clement Farabet (2011). "Torch7: A Matlab-like Environment for Machine Learning" (PDF). Neural Information Processing Systems.
  4. ^ What is going on with DeepMind and Google?
  5. ^ KDnuggets Interview with Yann LeCun, Deep Learning Expert, Director of Facebook AI Lab
  6. ^ CILVR Lab Software
  7. ^ IDIAP Research Institute : Torch
  8. ^ Google Scholar results for Torch: a modular machine learning software library citations
  9. ^ Theano: a CPU and GPU math expression compiler
  10. ^ Torch-android GitHub repository
  11. ^ Torch-ios GitHub repository
  12. ^ NeuFlow: A Runtime Reconfigurable Dataflow Processor for Vision