[MXNET-624] Clojure Package Api Docs for the Website (apache#11574)
Clojure Package Api Docs
gigasquid authored and nswamy committed Jul 10, 2018
1 parent 0bec94b commit 7a6e8de
Showing 16 changed files with 750 additions and 97 deletions.
3 changes: 1 addition & 2 deletions contrib/clojure-package/README.md
@@ -162,8 +162,6 @@ Do `lein run` for the cpu version or `lein run :gpu` for gpu.

To generate api docs, run `lein codox`. The html docs will be generated in the target/docs directory.

- _Note: There is an error thrown in the generated code due to some loading issues, but the docs are all still there._

## Code Coverage

To run the Code Coverage tool. Run `lein cloverage`.
@@ -235,3 +233,4 @@ Special thanks to people that provided testing and feedback to make this possible
- Burin Choomnuan
- Avram Aelony
- Jim Dunn
+ - Kovas Boguta
5 changes: 0 additions & 5 deletions contrib/clojure-package/doc/getting-started/Archlinux.md

This file was deleted.

8 changes: 0 additions & 8 deletions contrib/clojure-package/doc/getting-started/Ubuntu.md

This file was deleted.

12 changes: 0 additions & 12 deletions contrib/clojure-package/doc/intro.md

This file was deleted.

@@ -24,7 +24,7 @@

(def arr (ndarray/ones [2 3]))

- arr ;=> #object[ml.dmlc.mxnet.NDArray 0x482401ab "ml.dmlc.mxnet.NDArray@d8902656"]
+ arr ;=> #object[org.apache.mxnet.NDArray 0x597d72e "org.apache.mxnet.NDArray@e35c3ba9"]

(ndarray/shape-vec arr) ;=> [2 3]

41 changes: 23 additions & 18 deletions contrib/clojure-package/examples/tutorial/src/tutorial/module.clj
@@ -60,7 +60,7 @@
act2 (sym/activation "relu2" {:data fc2 :act-type "relu"})
fc3 (sym/fully-connected "fc3" {:data act2 :num-hidden 10})
out (sym/softmax-output "softmax" {:data fc3})]
- out) ;=> #object[ml.dmlc.mxnet.Symbol 0x2c7b036b "ml.dmlc.mxnet.Symbol@2c7b036b"]
+ out) ;=> #object[org.apache.mxnet.Symbol 0x1f43a406 "org.apache.mxnet.Symbol@1f43a406"]

;; You can also use as-> for easier threading

@@ -94,17 +94,17 @@
;;Modules provide high-level APIs for training, predicting, and evaluating. To fit a module, call the `fit` function with some DataIters:

(def mod (m/fit (m/module out) {:train-data train-data :eval-data test-data :num-epoch 1}))
- ;; INFO ml.dmlc.mxnet.module.BaseModule: Epoch[0] Train-accuracy=0.12521666
- ;; INFO ml.dmlc.mxnet.module.BaseModule: Epoch[0] Time cost=7863
- ;; INFO ml.dmlc.mxnet.module.BaseModule: Epoch[0] Validation-accuracy=0.2227
+ ;; Epoch 0 Train- [accuracy 0.12521666]
+ ;; Epoch 0 Time cost- 8392
+ ;; Epoch 0 Validation- [accuracy 0.2227]


;; You can pass in batch-end callbacks using batch-end-callback and epoch-end callbacks using epoch-end-callback in the `fit-params`. You can also set parameters, such as the optimizer and eval-metric, in the fit-params. To learn more, see the fit-params function options. To predict with a module, call `predict` with a DataIter:

(def results (m/predict mod {:eval-data test-data}))
- (first results) ;=> #object[ml.dmlc.mxnet.NDArray 0x270236e5 "ml.dmlc.mxnet.NDArray@9180e594"]
+ (first results) ;=> #object[org.apache.mxnet.NDArray 0x3540b6d3 "org.apache.mxnet.NDArray@a48686ec"]

- (first (ndarray/->vec (first results))) ;=> 0.099454574
+ (first (ndarray/->vec (first results))) ;=> 0.08261358
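The fit-params options mentioned above can be sketched in code. This is a hedged sketch, not part of the commit: it assumes the `org.apache.clojure-mxnet.optimizer` and `org.apache.clojure-mxnet.callback` namespaces are required as `optimizer` and `callback`, and it is wrapped in `comment` because it needs the full MXNet runtime to execute.

```clojure
;; Sketch (assumption, untested here): passing an optimizer and a
;; batch-end callback to `fit` through fit-params. Assumes
;; (:require [org.apache.clojure-mxnet.optimizer :as optimizer]
;;           [org.apache.clojure-mxnet.callback :as callback])
(comment
  (m/fit (m/module out)
         {:train-data train-data
          :eval-data test-data
          :num-epoch 1
          :fit-params
          (m/fit-params {:optimizer (optimizer/sgd {:learning-rate 0.01})
                         ;; report speed/metric every 100 batches of size 10
                         :batch-end-callback (callback/speedometer 10 100)})}))
```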

;;The module collects and returns all of the prediction results. For more details about the format of the return values, see the documentation for the `predict` function.

@@ -136,15 +136,19 @@
]))
(m/save-checkpoint mod {:prefix save-prefix :epoch epoch-num :save-opt-states true})))

- ;; INFO ml.dmlc.mxnet.module.Module: Saved checkpoint to my-model-0000.params
- ;; INFO ml.dmlc.mxnet.module.Module: Saved optimizer state to my-model-0000.states
- ;; INFO ml.dmlc.mxnet.module.Module: Saved checkpoint to my-model-0001.params
- ;; INFO ml.dmlc.mxnet.module.Module: Saved optimizer state to my-model-0001.states
+ ;; INFO org.apache.mxnet.module.Module: Saved checkpoint to my-model-0000.params
+ ;; INFO org.apache.mxnet.module.Module: Saved optimizer state to my-model-0000.states
+ ;; INFO org.apache.mxnet.module.Module: Saved checkpoint to my-model-0001.params
+ ;; INFO org.apache.mxnet.module.Module: Saved optimizer state to my-model-0001.states
+ ;; INFO org.apache.mxnet.module.Module: Saved checkpoint to my-model-0002.params
+ ;; INFO org.apache.mxnet.module.Module: Saved optimizer state to my-model-0002.states


;;To load the saved module parameters, call the `load-checkpoint` function:

(def new-mod (m/load-checkpoint {:prefix "my-model" :epoch 1 :load-optimizer-states true}))
- ;=> #object[ml.dmlc.mxnet.module.Module 0x352c8590 "ml.dmlc.mxnet.module.Module@352c8590"]

+ new-mod ;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.module.Module@5304d0f4"]

;;To initialize parameters, first bind the symbols to construct executors with the `bind` function. Then, initialize the parameters and auxiliary states by calling the `init-params` function.

@@ -160,23 +164,24 @@

;; {:arg-params
;; {"fc3_bias"
- ;; #object[ml.dmlc.mxnet.NDArray 0x4fcda4a0 "ml.dmlc.mxnet.NDArray@70276e89"],
+ ;; #object[org.apache.mxnet.NDArray 0x39adc3b0 "org.apache.mxnet.NDArray@49caf426"],
;; "fc2_weight"
- ;; #object[ml.dmlc.mxnet.NDArray 0x33651972 "ml.dmlc.mxnet.NDArray@b2a396eb"],
+ ;; #object[org.apache.mxnet.NDArray 0x25baf623 "org.apache.mxnet.NDArray@a6c8f9ac"],
;; "fc1_bias"
- ;; #object[ml.dmlc.mxnet.NDArray 0x3ad02326 "ml.dmlc.mxnet.NDArray@b4110d31"],
+ ;; #object[org.apache.mxnet.NDArray 0x6e089973 "org.apache.mxnet.NDArray@9f91d6eb"],
;; "fc3_weight"
- ;; #object[ml.dmlc.mxnet.NDArray 0x4c088d9b "ml.dmlc.mxnet.NDArray@19399ebd"],
+ ;; #object[org.apache.mxnet.NDArray 0x756fd109 "org.apache.mxnet.NDArray@2dd0fe3c"],
;; "fc2_bias"
- ;; #object[ml.dmlc.mxnet.NDArray 0x3cca519d "ml.dmlc.mxnet.NDArray@61012c"],
+ ;; #object[org.apache.mxnet.NDArray 0x1dc69c8b "org.apache.mxnet.NDArray@d128f73d"],
;; "fc1_weight"
- ;; #object[ml.dmlc.mxnet.NDArray 0xea5d61c "ml.dmlc.mxnet.NDArray@b16841b4"]},
+ ;; #object[org.apache.mxnet.NDArray 0x20abc769 "org.apache.mxnet.NDArray@b8e1c5e8"]},
;; :aux-params {}}


;;To assign parameter and aux state values, use the `set-params` function.

(m/set-params new-mod {:arg-params (m/arg-params new-mod) :aux-params (m/aux-params new-mod)})
- ;=> #object[ml.dmlc.mxnet.module.Module 0x11f34e1 "ml.dmlc.mxnet.module.Module@11f34e1"]
+ ;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.module.Module@5304d0f4"]

;;To resume training from a saved checkpoint, instead of calling `set-params`, directly call `fit`, passing the loaded parameters, so that `fit` knows to start from those parameters instead of initializing randomly:

15 changes: 13 additions & 2 deletions contrib/clojure-package/examples/tutorial/src/tutorial/symbol.clj
@@ -39,7 +39,17 @@
(sym/fully-connected "fc2" {:data data :num-hidden 64})
(sym/softmax-output "out" {:data data}))

- net ;=> #object[ml.dmlc.mxnet.Symbol 0x38c72806 "ml.dmlc.mxnet.Symbol@38c72806"]
+ net ;=> #object[org.apache.mxnet.Symbol 0x5c78c8c2 "org.apache.mxnet.Symbol@5c78c8c2"]


;;The basic arithmetic operators (plus, minus, multiplication, division)

;;The following example creates a computation graph that adds two inputs together.

(def a (sym/variable "a"))
(def b (sym/variable "b"))
(def c (sym/+ a b))
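To actually evaluate the graph, the symbol can be bound to concrete NDArrays and run through an executor. The following is a hedged sketch, not part of the commit: it assumes `executor` and `context` aliases for `org.apache.clojure-mxnet.executor` and `org.apache.clojure-mxnet.context`, and it is wrapped in `comment` because it needs the MXNet runtime to execute.

```clojure
;; Sketch (assumption, untested here): binding the (a + b) graph to data
;; and running it forward. Assumes
;; (:require [org.apache.clojure-mxnet.executor :as executor]
;;           [org.apache.clojure-mxnet.context :as context])
(comment
  (let [ex (sym/bind c (context/cpu)
                     {"a" (ndarray/ones [2 2])
                      "b" (ndarray/* (ndarray/ones [2 2]) 2)})]
    (-> (executor/forward ex) ;; run the computation
        (executor/outputs)    ;; seq of output NDArrays
        first
        (ndarray/->vec))))    ;; each element should be 1.0 + 2.0
```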


;; Each symbol takes a (unique) string name. NDArray and Symbol both represent a single tensor. Operators represent the computation between tensors. Operators take symbols (or NDArrays) as inputs, might additionally accept other hyperparameters such as the number of hidden neurons (num_hidden) or the activation type (act_type), and produce the output.

@@ -63,7 +73,8 @@ net ;=> #object[ml.dmlc.mxnet.Symbol 0x38c72806 "ml.dmlc.mxnet.Symbol@38c72806"]
(def net (sym/fully-connected "fc1" {:data net :weight w :num-hidden 128}))

(sym/list-arguments net)
;=> ["data" "fc1_weight" "fc1_bias" "fc2_weight" "fc2_bias" "out_label" "myweight" "fc1_bias"]
;=> ["data" "fc1_weight" "fc1_bias" "fc2_weight" "fc2_bias" "out_label" "myweight" "fc1_bias"]


;;In the above example, FullyConnected layer has 3 inputs: data, weight, bias. When any input is not specified, a variable will be automatically generated for it.

49 changes: 0 additions & 49 deletions contrib/clojure-package/examples/visualization/testviz

This file was deleted.

Binary file not shown.
33 changes: 33 additions & 0 deletions docs/api/clojure/index.md
@@ -0,0 +1,33 @@
# MXNet - Clojure API
MXNet supports the Clojure programming language. The MXNet Clojure package brings flexible and efficient GPU
computing and state-of-the-art deep learning to Clojure. It enables you to write seamless tensor/matrix computation with multiple GPUs in Clojure. It also lets you construct and customize state-of-the-art deep learning models in Clojure and apply them to tasks such as image classification and data science challenges.

See the [MXNet Clojure API Documentation](docs/index.html) for detailed API information.


## Tensor and Matrix Computations
You can perform tensor or matrix computation in pure Clojure:

```clojure
(def arr (ndarray/ones [2 3]))

arr ;=> #object[org.apache.mxnet.NDArray 0x597d72e "org.apache.mxnet.NDArray@e35c3ba9"]

(ndarray/shape-vec arr) ;=> [2 3]

(-> (ndarray/* arr 2)
(ndarray/->vec)) ;=> [2.0 2.0 2.0 2.0 2.0 2.0]

(ndarray/shape-vec (ndarray/* arr 2)) ;=> [2 3]

```

## Clojure API Tutorials
* [Module API is a flexible high-level interface for training neural networks.](module.html)
* [Symbolic API performs operations on Symbols to assemble neural networks from layers.](symbol.html)
* [NDArray API performs vector/matrix/tensor operations.](ndarray.html)
* [KVStore API performs multi-GPU and multi-host distributed training.](kvstore.html)


## Related Resources
* [MXNet Clojure API Documentation](docs/index.html)
86 changes: 86 additions & 0 deletions docs/api/clojure/kvstore.md
@@ -0,0 +1,86 @@
# KVStore API

Topics:

* [Basic Push and Pull](#basic-push-and-pull)
* [List Key-Value Pairs](#list-key-value-pairs)
* [API Reference](http://mxnet.incubator.apache.org/api/clojure/docs/org.apache.clojure-mxnet.kvstore.html)

To follow along with this documentation, you can use this namespace with the needed requires:

```clojure
(ns docs.kvstore
(:require [org.apache.clojure-mxnet.kvstore :as kvstore]
[org.apache.clojure-mxnet.ndarray :as ndarray]
[org.apache.clojure-mxnet.context :as context]))
```

## Basic Push and Pull

KVStore provides basic push and pull operations over multiple devices (GPUs) on a single machine.

### Initialization

Let's consider a simple example. It initializes
a (`string`, `NDArray`) pair into the store, and then pulls the value out.

```clojure
(def kv (kvstore/create "local")) ;; create a local kvstore
(def shape [2 3])
;;; init the kvstore with a vector of keys (strings) and ndarrays
(kvstore/init kv ["3"] [(ndarray/* (ndarray/ones shape) 2)])
(def a (ndarray/zeros shape))
(kvstore/pull kv ["3"] [a])
(ndarray/->vec a) ;=> [2.0 2.0 2.0 2.0 2.0 2.0]
```

### Push, Aggregation, and Updater

For any key that's been initialized, you can push a new value with the same shape to the key, as follows:

```clojure
(kvstore/push kv ["3"] [(ndarray/* (ndarray/ones shape) 8)])
(kvstore/pull kv ["3"] [a])
(ndarray/->vec a) ;=> [8.0 8.0 8.0 8.0 8.0 8.0]
```

The data that you want to push can be stored on any device. Furthermore, you can push multiple
values to the same key, in which case KVStore first sums all of these
values and then pushes the aggregated value, as follows (here we use multiple CPUs):

```clojure
(def cpus [(context/cpu 0) (context/cpu 1) (context/cpu 2)])
(def b [(ndarray/ones shape {:ctx (nth cpus 0)})
(ndarray/ones shape {:ctx (nth cpus 1)})
(ndarray/ones shape {:ctx (nth cpus 2)})])
(kvstore/push kv ["3" "3" "3"] b)
(kvstore/pull kv "3" a)
(ndarray/->vec a) ;=> [3.0 3.0 3.0 3.0 3.0 3.0]
```


### Pull

You've already seen how to pull a single key-value pair. Similar to the way that you use the push command, you can
pull the value into several devices with a single call.

```clojure
(def b [(ndarray/ones shape {:ctx (context/cpu 0)})
(ndarray/ones shape {:ctx (context/cpu 1)})])
(kvstore/pull kv ["3" "3"] b)
(map ndarray/->vec b) ;=> ([3.0 3.0 3.0 3.0 3.0 3.0] [3.0 3.0 3.0 3.0 3.0 3.0])
```

## List Key-Value Pairs

All of the operations that we've discussed so far are performed on a single key. KVStore also provides
an interface for a list of key-value pairs. For a single device, use the following:

```clojure
(def ks ["5" "7" "9"])
(kvstore/init kv ks [(ndarray/ones shape) (ndarray/ones shape) (ndarray/ones shape)])
(kvstore/push kv ks [(ndarray/ones shape) (ndarray/ones shape) (ndarray/ones shape)])
(def b [(ndarray/zeros shape) (ndarray/zeros shape) (ndarray/zeros shape)])
(kvstore/pull kv ks b)
(map ndarray/->vec b) ;=> ([1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0])
```