Revamped tutorials and API reference pages. Adding examples from mxnet repo to tutorials sections (#3545)

* Fixing broken links and format issues in getting started and architecture pages

* Fixing all broken links. Change all references from mxnet.readthedocs.org to mxnet.io.

* Re-organized tutorials and api reference documents. Added tutorials from mxnet repo examples folder.
sandeep-krishnamurthy authored and piiswrong committed Oct 17, 2016
1 parent 4d9ac5b commit 1c49718
Showing 51 changed files with 1,917 additions and 953 deletions.
26 changes: 13 additions & 13 deletions README.md
@@ -8,14 +8,14 @@
![banner](https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/banner.png)

MXNet is a deep learning framework designed for both *efficiency* and *flexibility*.
It allows you to ***mix*** the [flavours](http://mxnet.readthedocs.io/en/latest/architecture/program_model.html) of symbolic
It allows you to ***mix*** the [flavours](http://mxnet.io/architecture/index.html#deep-learning-system-design-concepts) of symbolic
programming and imperative programming to ***maximize*** efficiency and productivity.
At its core is a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly.
A graph optimization layer on top of that makes symbolic execution fast and memory efficient.
The library is portable and lightweight, and it scales to multiple GPUs and multiple machines.
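The dependency scheduler described above can be illustrated with a toy sketch in plain Python (the names `Op` and `schedule` are invented here; this is not MXNet's actual engine): operations whose inputs are ready are dispatched to a thread pool in parallel, while dependent operations wait for their inputs.

```python
from concurrent.futures import ThreadPoolExecutor

class Op:
    """A toy operation: reads input keys, writes one output key."""
    def __init__(self, name, inputs, output, fn):
        self.name, self.inputs, self.output, self.fn = name, inputs, output, fn

def schedule(ops, store):
    """Run each op as soon as its inputs exist; independent ops run in parallel."""
    pending = list(ops)
    with ThreadPoolExecutor() as pool:
        while pending:
            ready = [op for op in pending if all(k in store for k in op.inputs)]
            pending = [op for op in pending if op not in ready]
            futures = {op: pool.submit(op.fn, *[store[k] for k in op.inputs])
                       for op in ready}
            for op, fut in futures.items():
                store[op.output] = fut.result()
    return store

store = {"x": 2, "y": 3}
ops = [
    Op("a = x*x", ["x"], "a", lambda x: x * x),          # independent of b
    Op("b = y+1", ["y"], "b", lambda y: y + 1),          # independent of a
    Op("c = a+b", ["a", "b"], "c", lambda a, b: a + b),  # waits for both
]
result = schedule(ops, store)
print(result["c"])  # 2*2 + (3+1) = 8
```

The first two ops have no mutual dependency, so they are submitted in the same round; the third runs only after both results land in the store.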

MXNet is more than a deep learning project. It is also a collection of
[blue prints and guidelines](http://mxnet.readthedocs.io/en/latest/architecture/index.html#system-design-note) for building
[blueprints and guidelines](http://mxnet.io/architecture/index.html#deep-learning-system-design-concepts) for building
deep learning systems, and interesting insights into DL systems for hackers.

[![Join the chat at https://gitter.im/dmlc/mxnet](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/dmlc/mxnet?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
@@ -24,25 +24,25 @@ What's New
----------
* [MXNet Memory Monger, Training Deeper Nets with Sublinear Memory Cost](https://github.com/dmlc/mxnet-memonger)
* [Tutorial for NVidia GTC 2016](https://github.com/dmlc/mxnet-gtc-tutorial)
* [Embedding Torch layers and functions in MXNet](http://mxnet.readthedocs.org/en/latest/how_to/torch.html)
* [Embedding Torch layers and functions in MXNet](http://mxnet.io/how_to/torch.html)
* [MXNet.js: Javascript Package for Deep Learning in Browser (without server)](https://github.com/dmlc/mxnet.js/)
* [Design Note: Design Efficient Deep Learning Data Loading Module](http://mxnet.readthedocs.org/en/latest/architecture/note_data_loading.htmll)
* [MXNet on Mobile Device](http://mxnet.readthedocs.org/en/latest/how_to/smart_device.html)
* [Distributed Training](http://mxnet.readthedocs.org/en/latest/how_to/multi_devices.html)
* [Guide to Creating New Operators (Layers)](http://mxnet.readthedocs.org/en/latest/how_to/new_op.html)
* [Design Note: Design Efficient Deep Learning Data Loading Module](http://mxnet.io/architecture/note_data_loading.html)
* [MXNet on Mobile Device](http://mxnet.io/how_to/smart_device.html)
* [Distributed Training](http://mxnet.io/how_to/multi_devices.html)
* [Guide to Creating New Operators (Layers)](http://mxnet.io/how_to/new_op.html)
* [Amalgamation and Go Binding for Predictors](https://github.com/jdeng/gomxnet/)
* [Training Deep Net on 14 Million Images on A Single Machine](http://mxnet.readthedocs.org/en/latest/tutorials/imagenet_full.html)
* [Training Deep Net on 14 Million Images on A Single Machine](http://mxnet.io/tutorials/computer_vision/imagenet_full.html)

Contents
--------
* [Documentation and Tutorials](http://mxnet.readthedocs.org/en/latest/)
* [Design Notes](http://mxnet.readthedocs.org/en/latest/architecture/index.html)
* [Documentation and Tutorials](http://mxnet.io/)
* [Design Notes](http://mxnet.io/architecture/index.html)
* [Code Examples](example)
* [Installation](http://mxnet.readthedocs.org/en/latest/how_to/build.html)
* [Installation](http://mxnet.io/get_started/setup.html)
* [Pretrained Models](https://github.com/dmlc/mxnet-model-gallery)
* [Contribute to MXNet](http://mxnet.readthedocs.org/en/latest/how_to/contribute.html)
* [Frequent Asked Questions](http://mxnet.readthedocs.org/en/latest/how_to/faq.html)
* [Contribute to MXNet](http://mxnet.io/community/contribute.html)
* [Frequently Asked Questions](http://mxnet.io/how_to/faq.html)

Features
--------
10 changes: 8 additions & 2 deletions docs/api/c++/index.md
@@ -1,2 +1,8 @@
# MXNet C++ Package
Please refer to [https://github.com/dmlc/MXNet.cpp](https://github.com/dmlc/MXNet.cpp)
# MXNet - C++ API

Refer to the following for the namespaces, classes, and code files of the MXNet C++ package.

* [Namespaces](http://mxnet.io/doxygen/namespaces.html)
* [Classes](http://mxnet.io/doxygen/annotated.html)
* [Code Files](http://mxnet.io/doxygen/files.html)
* [MXNet CPP Package](https://github.com/dmlc/MXNet.cpp)
12 changes: 11 additions & 1 deletion docs/api/julia/index.md
@@ -1,2 +1,12 @@
# MXNet Julia Package
# MXNet - Julia API
MXNet supports the Julia programming language. The MXNet Julia package brings flexible and efficient GPU
computing and state-of-the-art deep learning to Julia.

- It enables you to write seamless tensor/matrix computation with multiple GPUs in Julia.
- It also enables you to construct and customize state-of-the-art deep learning models in Julia,
  and apply them to tasks such as image classification and data science challenges.

Julia documents are available at [http://dmlc.ml/MXNet.jl/latest/](http://dmlc.ml/MXNet.jl/latest/).
24 changes: 9 additions & 15 deletions docs/api/python/index.md
@@ -1,21 +1,15 @@
MXNet Python Package
====================
This page contains links to all the python related documents on python package.
To install the python package, checkout [Build and Installation Instruction](../../how_to/build.md).
There are three types of documents you can find about MXNet.
# MXNet - Python API

* [Tutorials](#tutorials) are self contained materials that introduces a certain use-cases of MXNet.
* [Code Examples](../../../example) contains example codes.
* [Python API Documents](#python-api-documents) contains documents about specific module, as well as reference of all API functions.
## Introduction
MXNet supports the Python programming language. The MXNet Python package brings flexible and efficient GPU
computing and state-of-the-art deep learning to Python.

Tutorials
---------
* [Python Overview Tutorial](tutorial.md)
* [Symbolic Configuration and Execution in Pictures](symbol_in_pictures.md)
* [How to Create New Operations (Layers)](../../how_to/new_op.md)
- It enables you to write seamless tensor/matrix computation with multiple GPUs in Python.
- It also enables you to construct and customize state-of-the-art deep learning models in Python,
and apply them to tasks such as image classification and data science challenges.

Python API Reference
--------------------

## Python API Reference
* [Module API](module.md), a flexible high-level interface for training neural networks
* [Model API](model.md), an alternate simple high-level interface for training neural networks
* [Symbolic API](symbol.md) for operations on NDArrays to assemble neural networks from layers
Expand Down
3 changes: 3 additions & 0 deletions docs/api/python/io.md
@@ -178,3 +178,6 @@ IO API Reference
<script>auto_index("mxnet.io");</script>
```
# Recommended Next Steps
* [NDArray API](ndarray.md) for vector/matrix/tensor operations
* [KVStore API](kvstore.md) for multi-GPU and multi-host distributed training
3 changes: 3 additions & 0 deletions docs/api/python/kvstore.md
@@ -135,3 +135,6 @@ update on key: 9
<script>auto_index("mxnet.kvstore");</script>
```

# Recommended Next Steps
* [Python Tutorials](http://mxnet.io/tutorials/index.html#Python-Tutorials)
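The KVStore pattern referenced above — init a key, push values that get merged by an updater, pull the result (the classic example prints messages like "update on key: 9") — can be sketched as a toy single-process store. `ToyKVStore` is an invented name; the real `mxnet.kvstore` operates on NDArrays across devices and hosts.

```python
class ToyKVStore:
    """Toy key-value store: init, push (merged by an updater), pull."""
    def __init__(self):
        self.store = {}
        # default updater: overwrite the stored value with the pushed one
        self.updater = lambda key, pushed, stored: pushed

    def init(self, key, value):
        self.store[key] = value

    def set_updater(self, updater):
        self.updater = updater

    def push(self, key, value):
        # merge the pushed value into the stored one via the updater
        self.store[key] = self.updater(key, value, self.store[key])

    def pull(self, key):
        return self.store[key]

kv = ToyKVStore()
kv.init(9, 1.0)
# an SGD-style updater: weight += 0.5 * pushed gradient
kv.set_updater(lambda key, grad, weight: weight + 0.5 * grad)
kv.push(9, 2.0)
print(kv.pull(9))  # 1.0 + 0.5 * 2.0 = 2.0
```

In distributed training the pushes come from many workers and the updater runs on the server, which is what makes the push/pull split useful.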
42 changes: 24 additions & 18 deletions docs/api/python/model.md
@@ -1,5 +1,5 @@
MXNet Python Model API
======================
# MXNet Python Model API

The model API is a simplified way to train neural networks using common best practices.
It is a thin wrapper built on top of the [ndarray](ndarray.md) and [symbolic](symbol.md)
modules to make neural network training easy.
@@ -11,8 +11,8 @@ modules to make neural network training easy.
* [Evaluation Metric API Reference](#evaluation-metric-api-reference)
* [Optimizer API Reference](#optimizer-api-reference)

Train a Model
-------------
## Train a Model

To train a model, follow two steps: first define the network configuration using a symbol,
then call ```mx.model.FeedForward.create``` to create the model for you.
The following example creates a two-layer neural network.
@@ -43,8 +43,8 @@ model.fit(X=data_set)
```
For more information, you can refer to [Model API Reference](#model-api-reference).
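The two-step pattern — declare the network configuration first, then let a `create`-style factory bind it to data — can be sketched generically in plain Python (toy `net` and `create` here are invented; the real API builds a `Symbol` and calls `mx.model.FeedForward.create`):

```python
import random

# Step 1: configure the network symbolically -- sizes only, nothing allocated yet.
net = [("fc1", 4, 8), ("fc2", 8, 2)]  # (name, n_in, n_out) per layer

# Step 2: "create" binds the configuration to data: allocate weights, run forward.
def create(config, x):
    random.seed(0)
    weights = {name: [[random.uniform(-1, 1) for _ in range(n_out)]
                      for _ in range(n_in)]
               for name, n_in, n_out in config}
    for name, n_in, n_out in config:
        # plain matrix-vector product per layer
        x = [sum(x[i] * weights[name][i][j] for i in range(n_in))
             for j in range(n_out)]
    return weights, x

weights, out = create(net, [1.0, 2.0, 3.0, 4.0])
print(len(out))  # 2 outputs, matching fc2's num_hidden
```

The point of the split is that the configuration is cheap, inspectable data, while allocation and computation are deferred until the data shapes are known.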

Save the Model
--------------
## Save the Model

It is important to save your work after the job is done.
To save the model, you can directly pickle it if you prefer the pythonic way.
We also provide save and load functions.
@@ -61,8 +61,8 @@ model_loaded = mx.model.FeedForward.load(prefix, iteration)
The advantage of these save and load functions is that they are language agnostic,
and you should be able to save and load directly to cloud storage such as S3 and HDFS.
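The distinction drawn here — pickling versus a dedicated, language-agnostic format — can be illustrated with a toy parameter dict (real MXNet serializes NDArrays plus a symbol definition; JSON stands in for that format here):

```python
import json
import os
import pickle
import tempfile

params = {"fc1_weight": [[0.1, 0.2], [0.3, 0.4]], "fc1_bias": [0.0, 0.0]}

d = tempfile.mkdtemp()
# Pythonic: pickle -- convenient, but only Python can read it back.
with open(os.path.join(d, "model.pkl"), "wb") as f:
    pickle.dump(params, f)

# Language-agnostic: a plain JSON dump any language (or service) can parse.
with open(os.path.join(d, "model.json"), "w") as f:
    json.dump(params, f)

with open(os.path.join(d, "model.json")) as f:
    loaded = json.load(f)
print(loaded == params)  # True
```

A format readable outside Python is what makes loading the same checkpoint from C++, Scala, or a storage service practical.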

Periodically Checkpoint
-----------------------
## Periodically Checkpoint

It is also helpful to periodically checkpoint your model after each iteration.
To do so, you can simply add a checkpoint callback ```do_checkpoint(path)``` to the function.
The training process will automatically checkpoint to the specified place after each iteration.
@@ -78,8 +78,8 @@ model = mx.model.FeedForward.create(
```
You can load the model checkpoint later using ```FeedForward.load```.
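The callback mechanism behind this can be sketched in a few lines: a `do_checkpoint(prefix)`-style helper returns a function that the training loop invokes at the end of every epoch. The toy loop and the `.json` naming below are invented for illustration; the real callback writes `prefix-NNNN.params` files.

```python
import json
import os
import tempfile

def do_checkpoint(prefix):
    """Return an end-of-epoch callback that saves params to prefix-<epoch>.json."""
    def callback(epoch, params):
        with open("%s-%04d.json" % (prefix, epoch), "w") as f:
            json.dump(params, f)
    return callback

def train(num_epoch, epoch_end_callback):
    params = {"w": 0.0}
    for epoch in range(1, num_epoch + 1):
        params["w"] += 0.1            # pretend one epoch of updates
        epoch_end_callback(epoch, params)
    return params

prefix = os.path.join(tempfile.mkdtemp(), "mymodel")
train(3, do_checkpoint(prefix))
saved = sorted(os.listdir(os.path.dirname(prefix)))
print(saved)  # ['mymodel-0001.json', 'mymodel-0002.json', 'mymodel-0003.json']
```

Because the loop only sees an opaque callable, any number of callbacks (logging, early stopping, checkpointing) can be composed without touching the training code.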

Use Multiple Devices
--------------------
## Use Multiple Devices

Simply set ```ctx``` to the list of devices you would like to train on.

@@ -99,8 +99,8 @@ The training will be done in a data parallel way on the GPUs you specified.

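Under the hood, data-parallel training splits each batch across the devices in `ctx`, computes gradients independently, and averages them before the shared update. A toy sketch with a made-up one-parameter model and Python lists standing in for GPUs (not MXNet's actual executor logic):

```python
def grad(batch, w):
    """Toy gradient of mean squared error for y = w*x over (x, y) pairs."""
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

def data_parallel_step(batch, w, num_device, lr=0.1):
    # split the batch evenly across "devices"
    shards = [batch[i::num_device] for i in range(num_device)]
    # each device computes a gradient on its own shard (sequentially here)
    grads = [grad(shard, w) for shard in shards]
    # average the per-device gradients and apply a single shared update
    return w - lr * sum(grads) / len(grads)

batch = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # y = 2x
w = 0.0
for _ in range(50):
    w = data_parallel_step(batch, w, num_device=2)
print(round(w, 2))  # converges toward the true slope 2.0
```

The averaging step is why data parallelism gives (approximately) the same update as single-device training on the full batch, just computed in parallel.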

Initializer API Reference
-------------------------
## Initializer API Reference


```eval_rst
.. automodule:: mxnet.initializer
@@ -111,8 +111,8 @@ Initializer API Reference
<script>auto_index("mxnet.initializer");</script>
```

Evaluation Metric API Reference
-------------------------------
## Evaluation Metric API Reference


```eval_rst
.. automodule:: mxnet.metric
@@ -123,8 +123,8 @@ Evaluation Metric API Reference
<script>auto_index("mxnet.metric");</script>
```

Optimizer API Reference
-----------------------
## Optimizer API Reference


```eval_rst
.. automodule:: mxnet.optimizer
@@ -135,8 +135,8 @@ Optimizer API Reference
<script>auto_index("mxnet.optimizer");</script>
```

Model API Reference
-------------------
## Model API Reference


```eval_rst
.. automodule:: mxnet.model
@@ -146,3 +146,9 @@ Model API Reference
<script>auto_index("mxnet.model");</script>
```

# Recommended Next Steps
* [Symbolic API](symbol.md) for operations on NDArrays to assemble neural networks from layers
* [IO Data Loading API](io.md) for parsing and loading data
* [NDArray API](ndarray.md) for vector/matrix/tensor operations
* [KVStore API](kvstore.md) for multi-GPU and multi-host distributed training
36 changes: 19 additions & 17 deletions docs/api/python/module.md
@@ -1,10 +1,7 @@
Module Interface HowTo
======================

# Module Interface - How To
The module API provides intermediate-level and high-level interfaces for computation with neural networks in MXNet. A "module" is an instance of a subclass of `BaseModule`. The most widely used module class is simply called `Module`, which wraps a `Symbol` and one or more `Executor`s. Please refer to the `BaseModule` API doc below for a full list of available functions. Each specific subclass of module may have some extra interface functions. We provide here some examples of common use cases. All the module APIs live in the namespace `mxnet.module`, or simply `mxnet.mod`.

Preparing a module for computation
----------------------------------
## Preparing a module for computation

To construct a module, refer to the constructors of the specific module class. For example, the `Module` class takes a `Symbol` as input,

@@ -36,8 +33,7 @@ mod.init_params()

Now you can compute with the module via functions like `forward()`, `backward()`, etc. If you simply want to fit a module, you do not need to call `bind()` and `init_params()` explicitly, as the `fit()` function will call them automatically if needed.
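The construct → `bind()` → `init_params()` → `forward()` lifecycle can be mimicked with a toy class (invented names and arithmetic; the real `mxnet.mod.Module` allocates device memory at `bind()` time and wraps a `Symbol`):

```python
class ToyModule:
    """Toy module: construct -> bind (allocate) -> init_params -> forward."""
    def __init__(self, symbol):
        self.symbol = symbol          # an output count, standing in for a Symbol
        self.binded = False
        self.params = None

    def bind(self, data_shapes):
        self.data_shapes = data_shapes
        self.binded = True            # memory would be allocated here

    def init_params(self):
        # enforcing the lifecycle order is the point of the two-step setup
        assert self.binded, "call bind() before init_params()"
        self.params = [0.0] * self.symbol

    def forward(self, data):
        return [sum(data) + p for p in self.params]

mod = ToyModule(symbol=3)
mod.bind(data_shapes=[("data", (1, 4))])
mod.init_params()
out = mod.forward([1.0, 2.0, 3.0, 4.0])
print(out)  # [10.0, 10.0, 10.0]
```

Separating binding from parameter initialization is what lets `fit()` perform both automatically, while still allowing manual control (e.g. binding to new shapes or sharing parameters).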

Training, Predicting, and Evaluating
------------------------------------
## Training, Predicting, and Evaluating

Modules provide high-level APIs for training, predicting and evaluating. To fit a module, simply call the `fit()` function with some `DataIter`s:

@@ -71,8 +67,8 @@ mod.score(val_dataiter, metric)

It will run predictions on each batch in the provided `DataIter` and compute the evaluation score using the provided `EvalMetric`. The evaluation results are stored in `metric`, so you can query them later on.

Saving and Loading Module Parameters
------------------------------------
## Saving and Loading Module Parameters


You can save the module parameters in each training epoch by using a `checkpoint` callback.

@@ -103,17 +99,16 @@ mod.fit(..., arg_params=arg_params, aux_params=aux_params,
Note that we also pass in `begin_epoch` so that `fit()` knows we are resuming from a previously saved epoch.
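Resuming is symmetric to checkpointing: load the parameters saved after epoch n, then continue training from epoch n+1 with a begin-epoch argument. A toy sketch of that bookkeeping (the `train` helper is invented; the real call is `mod.fit(..., begin_epoch=n)`):

```python
def train(params, begin_epoch, num_epoch):
    """Toy training loop: one update per epoch, honoring begin_epoch on resume."""
    epochs_run = []
    for epoch in range(begin_epoch, num_epoch):
        params["w"] += 1.0            # pretend one epoch of updates
        epochs_run.append(epoch)
    return params, epochs_run

# first run: epochs 0..2, then "save" a checkpoint
params, ran = train({"w": 0.0}, begin_epoch=0, num_epoch=3)
checkpoint = dict(params)             # stands in for the saved epoch-3 params

# resume: reload the checkpoint and continue from epoch 3
params, ran2 = train(dict(checkpoint), begin_epoch=3, num_epoch=5)
print(ran2, params["w"])  # [3, 4] 5.0
```

Without the begin epoch, the resumed run would replay epochs 0-2 — re-running learning-rate schedules and callbacks that were already applied.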


Module Interface API
====================
# Module Interface API


```eval_rst
.. raw:: html
<script type="text/javascript" src='../../_static/js/auto_module_index.js'></script>
```

The BaseModule Interface
------------------------
## The BaseModule Interface

```eval_rst
.. automodule:: mxnet.module.base_module
@@ -124,8 +119,8 @@ The BaseModule Interface
<script>auto_index("mxnet.module.base_module");</script>
```

The Built-in Modules
--------------------
## The Built-in Modules


```eval_rst
.. automodule:: mxnet.module.module
@@ -154,8 +149,8 @@ The Built-in Modules
<script>auto_index("mxnet.module.sequential_module");</script>
```

Writing Modules in Python
-------------------------
## Writing Modules in Python


```eval_rst
.. automodule:: mxnet.module.python_module
@@ -165,3 +160,10 @@ Writing Modules in Python
<script>auto_index("mxnet.module.python_module");</script>
```

# Recommended Next Steps
* [Model API](model.md), an alternate simple high-level interface for training neural networks
* [Symbolic API](symbol.md) for operations on NDArrays to assemble neural networks from layers
* [IO Data Loading API](io.md) for parsing and loading data
* [NDArray API](ndarray.md) for vector/matrix/tensor operations
* [KVStore API](kvstore.md) for multi-GPU and multi-host distributed training
3 changes: 3 additions & 0 deletions docs/api/python/ndarray.md
@@ -150,3 +150,6 @@ Context API Reference
<script>auto_index("mxnet.context");</script>
```

# Recommended Next Steps
* [KVStore API](kvstore.md) for multi-GPU and multi-host distributed training
7 changes: 7 additions & 0 deletions docs/api/python/symbol.md
@@ -192,3 +192,10 @@ Testing Utility Reference
<script>auto_index("mxnet.test_utils");</script>
```

# Recommended Next Steps
* [Symbolic Configuration and Execution in Pictures](http://mxnet.io/api/python/symbol_in_pictures.html)
* [IO Data Loading API](io.md) for parsing and loading data
* [NDArray API](ndarray.md) for vector/matrix/tensor operations
* [KVStore API](kvstore.md) for multi-GPU and multi-host distributed training
