# Move doc sections to "guide" + "reference" #2115

Changes from 17 commits.
```diff
@@ -5,11 +5,11 @@ globally providing a rich and consistent user experience.
 This is a non-exhaustive list of Julia packages, nicely complementing `Flux` in typical
 machine learning and deep learning workflows. To add your project please send a [PR](https://github.com/FluxML/Flux.jl/pulls).
-See also academic work citing Flux or Zygote.
+See also academic work [citing Flux](https://scholar.google.com/scholar?cites=9731162218836700005&hl=en) or [citing Zygote](https://scholar.google.com/scholar?cites=11943854577624257878&hl=en).
 
 ## Flux models
 
 Packages that are actual `Flux` models but are not available directly through the `Flux` package.
 - Flux's [model-zoo](https://github.com/FluxML/model-zoo) contains examples from many domains.
 
 ### Computer vision
 
@@ -38,6 +38,8 @@ Packages that are actual `Flux` models but are not available directly through th
 
 - [FluxArchitectures.jl](https://github.com/sdobber/FluxArchitectures.jl) is a collection of advanced network architectures for time series forecasting.
 
+---
+
 ## Tools closely associated with Flux
 
 Utility tools you're unlikely to have met if you never used Flux!
 
@@ -64,9 +66,10 @@ Tools to put data into the right order for creating a model.
 
 ### Parameters
 
 - [Parameters.jl](https://github.com/mauro3/Parameters.jl) types with default field values, keyword constructors and (un-)pack macros.
+- [ParameterSchedulers.jl](https://github.com/darsnack/ParameterSchedulers.jl) standard scheduling policies for machine learning.
 
 ---
 
 ## Differentiable programming
 
 Packages based on differentiable programming but not necessarily related to Machine Learning.
 
@@ -90,6 +93,7 @@ Packages based on differentiable programming but not necessarily related to Mach
 
 - [OnlineStats.jl](https://github.com/joshday/OnlineStats.jl) provides single-pass algorithms for statistics.
 
+---
 
 ## Useful miscellaneous packages
 
@@ -104,8 +108,22 @@ Some useful and random packages!
 - [ProgressMeter.jl](https://github.com/timholy/ProgressMeter.jl) progress meters for long-running computations.
 - [TensorBoardLogger.jl](https://github.com/PhilipVinc/TensorBoardLogger.jl) easy peasy logging to [tensorboard](https://www.tensorflow.org/tensorboard) in Julia
 - [ArgParse.jl](https://github.com/carlobaldassi/ArgParse.jl) is a package for parsing command-line arguments to Julia programs.
+- [Parameters.jl](https://github.com/mauro3/Parameters.jl) types with default field values, keyword constructors and (un-)pack macros.
 - [BSON.jl](https://github.com/JuliaIO/BSON.jl) is a package for working with the Binary JSON serialisation format.
 - [DataFrames.jl](https://github.com/JuliaData/DataFrames.jl) in-memory tabular data in Julia.
 - [DrWatson.jl](https://github.com/JuliaDynamics/DrWatson.jl) is a scientific project assistant software.
 
 This tight integration among Julia packages is shown in some of the examples in the [model-zoo](https://github.com/FluxML/model-zoo) repository.
 
+---
+
+## Alternatives to Flux
```
> **Review comment:** Small changes to the links page. The biggest is this section on SimpleChains / KNet / Lux. See what you think?
```diff
+
+Julia has several other libraries for making neural networks.
+
+* [SimpleChains.jl](https://github.com/PumasAI/SimpleChains.jl) is focused on making small, simple, CPU-based, neural networks fast. Uses [LoopVectorization.jl](https://github.com/JuliaSIMD/LoopVectorization.jl). (Was `FastChain` in DiffEqFlux.jl)
+
+* [Knet.jl](https://github.com/denizyuret/Knet.jl) is a neural network library built around [AutoGrad.jl](https://github.com/denizyuret/AutoGrad.jl), with beautiful documentation.
```
> **Review comment:** Is "with beautiful documentation" from a tagline somewhere? Trying to think of a better point of comparison for this bullet.
>
> **Reply:** It is really nice: https://denizyuret.github.io/Knet.jl/latest/mlp/. But I agree this may read as a dismissal. The entire section is a bit of a minefield...
>
> **Reply:** Trying to find a way to express that KNet takes an equally valid but different approach to the ML library design problem. The README and https://denizyuret.github.io/Knet.jl/latest/tutorial/#Philosophy don't give us much in the way of talking points to succinctly contrast against Flux, however.
>
> **Reply:** Yes, this seems aimed more at tensorflow et al, parallel to pytorch / Flux / etc.
```diff
+
+* [Lux.jl](https://github.com/avik-pal/Lux.jl) (earlier ExplicitFluxLayers.jl) shares much of the design, use-case, and NNlib.jl / Optimisers.jl back-end of Flux. But instead of encapsulating all parameters within the model structure, it separates this into 3 components: a model, a tree of parameters, and a tree of model states.
```
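To illustrate that three-way split, here is a rough sketch of the Lux workflow (hedged: written against Lux's API around the time of this PR; `Lux.setup` and the `(x, ps, st)` calling convention are the load-bearing assumptions):

```julia
using Lux, Random

rng = Random.default_rng()
model = Dense(2 => 3, tanh)        # the model: a description of the layer, holding no numbers

ps, st = Lux.setup(rng, model)     # parameters and state are built separately, as NamedTuples

x = randn(Float32, 2, 8)
y, st = model(x, ps, st)           # params and state are passed in explicitly; a new state comes back
```

In Flux, by contrast, `Dense(2 => 3, tanh)` itself owns its weight and bias arrays.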
---
```diff
@@ -2,29 +2,25 @@
 
 Flux is a library for machine learning. It comes "batteries-included" with many useful tools built in, but also lets you use the full power of the Julia language where you need it. We follow a few key principles:
 
-* **Doing the obvious thing**. Flux has relatively few explicit APIs for features like regularisation or embeddings. Instead, writing down the mathematical form will work – and be fast.
-* **Extensible by default**. Flux is written to be highly extensible and flexible while being performant. Extending Flux is as simple as using your own code as part of the model you want - it is all [high-level Julia code](https://github.com/FluxML/Flux.jl/blob/ec16a2c77dbf6ab8b92b0eecd11661be7a62feef/src/layers/recurrent.jl#L131). When in doubt, it’s well worth looking at [the source](https://github.com/FluxML/Flux.jl/tree/master/src). If you need something different, you can easily roll your own.
-* **Play nicely with others**. Flux works well with Julia libraries from [images](https://github.com/JuliaImages/Images.jl) to [differential equation solvers](https://github.com/SciML/DifferentialEquations.jl), so you can easily build complex data processing pipelines that integrate Flux models.
+* **Doing the obvious thing**. Flux has relatively few explicit APIs. Instead, writing down the mathematical form will work – and be fast.
+* **Extensible by default**. Flux is written to be highly flexible while being performant. Extending Flux is as simple as using your own code as part of the model you want - it is all [high-level Julia code](https://github.com/FluxML/Flux.jl/tree/master/src).
+* **Play nicely with others**. Flux works well with unrelated Julia libraries from [images](https://github.com/JuliaImages/Images.jl) to [differential equation solvers](https://github.com/SciML/DifferentialEquations.jl), rather than duplicating them.
 
-## Installation
+### Installation
 
-Download [Julia 1.6](https://julialang.org/downloads/) or later, preferably the current stable release. You can add Flux using Julia's package manager, by typing `] add Flux` in the Julia prompt.
+Download [Julia 1.6](https://julialang.org/downloads/) or later, preferably the current stable release. You can add Flux using Julia's package manager, by typing `] add Flux` in the Julia prompt. This will automatically install several other packages, including [CUDA.jl](https://github.com/JuliaGPU/CUDA.jl) which supports Nvidia GPUs.
 
-This will automatically install several other packages, including [CUDA.jl](https://github.com/JuliaGPU/CUDA.jl) which supports Nvidia GPUs. To directly access some of its functionality, you may want to add `] add CUDA` too. The page on [GPU support](gpu.md) has more details.
```
> **Review comment:** I tried to delete lots of text from the welcome page. It was pretty verbose.
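For readers who prefer scripts to the REPL's `pkg>` mode, the same installation can be done with Pkg; a trivial sketch (the `CUDA` line mirrors the optional `] add CUDA` step from the deleted sentence):

```julia
using Pkg
Pkg.add("Flux")   # equivalent to typing `] add Flux` at the Julia prompt
Pkg.add("CUDA")   # optional: only needed to call CUDA.jl's functionality directly
using Flux
```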
```diff
+### Learning Flux
 
-Other closely associated packages, also installed automatically, include [Zygote](https://github.com/FluxML/Zygote.jl), [Optimisers](https://github.com/FluxML/Optimisers.jl), [NNlib](https://github.com/FluxML/NNlib.jl), [Functors](https://github.com/FluxML/Functors.jl) and [MLUtils](https://github.com/JuliaML/MLUtils.jl).
+The **[quick start](@ref man-quickstart)** page trains a simple neural network.
 
-## Learning Flux
+The rest of the **guide** provides a from-scratch introduction to Flux's take on models and how they work, starting with [fitting a line](@ref man-overview). Once you understand these docs, congratulations, you also understand [Flux's source code](https://github.com/FluxML/Flux.jl), which is intended to be concise, legible and a good reference for more advanced concepts.
 
-The [quick start](@ref man-quickstart) page trains a simple neural network.
+There are some **tutorials** about building particular models. The **[model zoo](https://github.com/FluxML/model-zoo/)** has starting points for many other common ones. And finally, the **[ecosystem page](ecosystem.md)** lists packages which define Flux models.
 
-This rest of this documentation provides a from-scratch introduction to Flux's take on models and how they work, starting with [fitting a line](@ref man-overview). Once you understand these docs, congratulations, you also understand [Flux's source code](https://github.com/FluxML/Flux.jl), which is intended to be concise, legible and a good reference for more advanced concepts.
+The **reference** section contains API listings, including some companion packages: [Zygote](https://github.com/FluxML/Zygote.jl) (automatic differentiation), [Optimisers](https://github.com/FluxML/Optimisers.jl) (training), [NNlib](https://github.com/FluxML/NNlib.jl) (misc functions) and more.
 
 Sections with 📚 contain API listings. The same text is available at the Julia prompt, by typing for example `?gpu`.
 
-If you just want to get started writing models, the [model zoo](https://github.com/FluxML/model-zoo/) gives good starting points for many common ones.
 
-## Community
+### Community
 
 Everyone is welcome to join our community on the [Julia discourse forum](https://discourse.julialang.org/), or the [slack chat](https://discourse.julialang.org/t/announcing-a-julia-slack/4866) (channel #machine-learning). If you have questions or issues we'll try to help you out.
```
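As a taste of what the guide's "fitting a line" page covers, a minimal sketch using the Flux API of this era (the toy data and hyperparameters are invented for illustration):

```julia
using Flux

# Toy data for the line y = 2x - 1, as (input, target) pairs of 1-element vectors
data = [(Float32[x], Float32[2x - 1]) for x in -2:0.1f0:2]

model = Dense(1 => 1)                 # one weight and one bias to learn
loss(x, y) = Flux.mse(model(x), y)    # mean squared error against the target
opt = Descent(0.1)                    # plain gradient descent

for epoch in 1:100
    Flux.train!(loss, Flux.params(model), data, opt)
end

model.weight, model.bias              # should approach ([2.0], [-1.0])
```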
---
````diff
@@ -213,13 +213,13 @@ m(5) # => 26
 
 ## Layer Helpers
 
-There is still one problem with this `Affine` layer, that Flux does not know to look inside it. This means that [`Flux.train!`](@ref) won't see its parameters, nor will [`gpu`](@ref) be able to move them to your GPU. These features are enabled by the `@functor` macro:
+There is still one problem with this `Affine` layer, that Flux does not know to look inside it. This means that [`Flux.train!`](@ref) won't see its parameters, nor will [`gpu`](@ref) be able to move them to your GPU. These features are enabled by the [`@functor`](@ref Functors.@functor) macro:
 
 ```
 Flux.@functor Affine
 ```
````
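To make the effect concrete, a self-contained sketch (the `Affine` struct is re-created from scratch here, since its definition from earlier in the page sits outside this diff):

```julia
using Flux

struct Affine
  W
  b
end

Affine(in::Integer, out::Integer) = Affine(randn(Float32, out, in), zeros(Float32, out))

(m::Affine)(x) = m.W * x .+ m.b   # forward pass: an affine transformation

Flux.@functor Affine              # tell Flux (via Functors.jl) to look inside Affine

m = Affine(3, 2)
length(Flux.params(m))            # == 2: train! and gpu can now reach W and b
```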
````diff
 
-Finally, most Flux layers make bias optional, and allow you to supply the function used for generating random weights. We can easily add these refinements to the `Affine` layer as follows:
+Finally, most Flux layers make bias optional, and allow you to supply the function used for generating random weights. We can easily add these refinements to the `Affine` layer as follows, using the helper function [`create_bias`](@ref Flux.create_bias):
 
 ```
 function Affine((in, out)::Pair; bias=true, init=Flux.randn32)
@@ -230,7 +230,3 @@ end
 
 Affine(3 => 1, bias=false, init=ones) |> gpu
 ```
````
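The diff view hides the body of that constructor. Filled in, it plausibly reads as below; treat this as a sketch, with `Flux.create_bias(W, bias, out)` returning a zero vector of length `out` when `bias=true` and a non-trainable `false` when `bias=false`:

```julia
function Affine((in, out)::Pair; bias=true, init=Flux.randn32)
  W = init(out, in)                   # e.g. a 1×3 matrix for Affine(3 => 1)
  b = Flux.create_bias(W, bias, out)  # bias matching W's element type, or `false`
  return Affine(W, b)
end

Affine(3 => 1, bias=false, init=ones) |> gpu
```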
````diff
-
-```@docs
-Functors.@functor
-```
````

> **Review comment:** This macro has its (long) docstring in the Functors page, so it need not appear here (and in fact Documenter complains about it appearing twice).
> **Review comment:** Having example functions right in the heading feels a little strange. Could we instead adapt the tagline of each page? e.g. "Neural Network primitives - NNlib.jl".
>
> **Reply:** I don't love them, but they are compact, and I didn't manage to think of a better way (last time around). I mean, I know what "primitives" means but it's not so obvious... what kind of utils is MLUtils, what on earth is Functors, in less than a paragraph? (They aren't new, and the page titles are longer explanations, which don't fit nicely in the sidebar.)
>
> **Reply:** My thought was to take inspiration from the shortened titles, something like so:
>
> **Reply:** Last commit does this, see what you think (PR before + after):
>
> **Reply:** Looks pretty good!
>
> **Reply:** Shall we do this, then refine as necessary?