
Torch (machine learning)

From Wikipedia, the free encyclopedia

Torch
Original author(s): Ronan Collobert, Koray Kavukcuoglu, Clement Farabet
Initial release: October 2002[1]
Stable release: 7.0 / April 24, 2014[2]
Repository: github.com/torch/torch7
Written in: Lua, LuaJIT, C, CUDA and C++
Operating system: Linux, Mac OS X, Microsoft Windows
Type: Library for deep learning
License: BSD License
Website: torch.ch

Torch is an open source deep learning library for the Lua programming language.[3] Torch is a scientific computing framework with wide support for machine learning algorithms. It is divided into different packages, each accessible in its own GitHub repository. These packages can be installed using the luarocks package manager. Although Torch officially supports the torch, nn, optim, gnuplot, cutorch, cunn, paths, image, trepl, cwrap and qtlua packages, a variety of other Torch packages are available via a luarocks summary page. It uses the fast scripting language LuaJIT and an underlying C implementation.

torch

The main package torch provides a flexible N-dimensional array or Tensor, which supports routines for indexing, slicing, transposing, type-casting, resizing, sharing storage and cloning. This object is used by most other packages and thus forms the core object of the library.
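
A minimal sketch of these routines, assuming a working torch installation (the sizes and the particular operations shown are illustrative only):

require 'torch'

local t = torch.rand(4, 6)          -- 4x6 Tensor of uniform random numbers
print(t:size())                     -- prints the sizes: 4 and 6
local row = t[2]                    -- indexing: the second row
local cols = t:narrow(2, 1, 3)      -- slicing: columns 1 to 3, shares storage with t
local tt = t:t()                    -- transpose, also shares storage (no copy)
local f = t:float()                 -- type-cast to a FloatTensor (new storage)
t:resize(2, 12)                     -- resize in place (same 24 elements)
local c = t:clone()                 -- clone into a new Tensor with its own storage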

The torch package also simplifies object oriented programming and serialization by providing various convenience functions which are used throughout its packages. The torch.class(classname, parentclass) function can be used to create object factories (classes). When the constructor is called, torch initializes and sets a Lua table with the user-defined metatable, which makes the table an object.
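
As a sketch of this mechanism, following the pattern used in the torch documentation (the Dog class and its methods are purely illustrative):

require 'torch'

do
   -- register a class named 'Dog'; methods are attached to the returned table
   local Dog = torch.class('Dog')

   -- the constructor is the __init method, run when Dog(...) is called
   function Dog:__init(name)
      self.name = name
   end

   function Dog:speak()
      print(self.name .. ' says woof')
   end
end

local rex = Dog('Rex')   -- torch.class registered 'Dog' in the global namespace
rex:speak()              -- prints 'Rex says woof'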

Objects created with the torch factory can also be serialized, as long as they do not contain references to objects that cannot be serialized, such as Lua coroutines and Lua userdata. However, userdata can be serialized if it is wrapped by a table (or metatable) that provides read() and write() methods.
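
For example, a Tensor (or any other torch object) can be written to and read from disk with torch.save and torch.load; the file name used here is only illustrative:

require 'torch'

local t = torch.rand(3, 3)
torch.save('tensor.t7', t)           -- serialize to disk in Torch's binary format
local t2 = torch.load('tensor.t7')   -- deserialize; t2 is a new 3x3 DoubleTensor
print(t2:size())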

nn

The nn package provides support for neural networks and energy-based models. It is divided into modular objects that share a common Module interface. Modules have a forward and a backward function that allow them to feedforward and backpropagate, respectively. Modules can be combined using Containers, which are themselves Modules, to create complex task-tailored graphs.
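
A minimal sketch of this interface, assuming the torch and nn packages are installed (the layer sizes are arbitrary):

require 'nn'

-- nn.Sequential is a Container that chains Modules into a single Module
local mlp = nn.Sequential()
mlp:add(nn.Linear(10, 25))
mlp:add(nn.Tanh())
mlp:add(nn.Linear(25, 1))

local input = torch.rand(10)
local output = mlp:forward(input)                   -- feedforward
local gradOutput = torch.ones(1)
local gradInput = mlp:backward(input, gradOutput)   -- backpropagate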

cwrap

Via LuaJIT, Torch provides a simple interface to C. For example, the following C prototype:

TH_API void THTensor_(rand)(THTensor *r_, THGenerator *_generator, THLongStorage *size);

can be adapted to Lua with default values:

local wrap = require 'cwrap'
wrap("rand",
      cname("rand"),
      {{name=Tensor, default=true, returned=true, method={default='nil'}},
      {name='Generator', default=true},
      {name="LongArg"}})

where cname("rand") specifies the C function name, while the first argument "rand" specifies the Lua function name.
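
Once such a declaration has been compiled into the torch package, the wrapped routine is callable from Lua with the declared defaults filled in, for example (a sketch):

require 'torch'

local r = torch.rand(2, 3)   -- the Tensor and Generator arguments take their defaults
print(r)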

By this same principle, many BLAS (Basic Linear Algebra Subprograms) routines have been adapted to work with Torch Tensors. Furthermore, via its cutorch package, various CUDA routines have been adapted to work with the special CudaTensor type. This allows Torch to utilize fast NVIDIA GPUs for some of its supported operations.
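
A sketch of GPU usage through cutorch, assuming a CUDA-capable device and the cutorch package (the matrix sizes are arbitrary):

require 'cutorch'

local a = torch.rand(1000, 1000):cuda()   -- copy to the GPU as a CudaTensor
local b = torch.rand(1000, 1000):cuda()
local c = torch.mm(a, b)                  -- the matrix multiply runs on the GPU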

References

  1. "Torch: a modular machine learning software library". 30 October 2002. Retrieved 24 April 2014.
  2. Ronan Collobert. "Torch7". GitHub.
  3. Ronan Collobert; Koray Kavukcuoglu; Clement Farabet (2011). "Torch7: A Matlab-like Environment for Machine Learning" (PDF). Neural Information Processing Systems.