Release v.0.1.4 development version
The update to v.0.1.4 necessitated the following changes:
  - Modified setup.py to the current version
  - Fixed broken examples from documentation tutorials
  - Updated Features in README.rst

In addition, some changes were made to prepare the package
ahead of v.0.1.5dev:
  - Excluded the '/docs' and '/tests' directories when uploading
    to PyPI (see the sketch below)
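
The setup.py edit for this exclusion is not visible in the truncated diff
below; purely as a hypothetical sketch (not the author's actual change), one
common way to keep docs/ and tests/ out of the distributed package with
setuptools is::

    # Hypothetical sketch only -- the real setup.py change is not shown in this diff.
    from setuptools import setup, find_packages

    setup(
        name='pyswarms',
        version='0.1.4',
        packages=find_packages(exclude=['docs', 'docs.*', 'tests', 'tests.*']),
    )
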
ljvmiranda921 committed Aug 4, 2017
1 parent b2f3c97 commit 8be5e6b
Showing 13 changed files with 171 additions and 172 deletions.
8 changes: 7 additions & 1 deletion HISTORY.rst
@@ -20,4 +20,10 @@ History
* Pre-release
* Implemented Binary PSO
* More efficient API for gbest and lbest
* Documentation and tests
* Documentation and tests

0.1.4 (2017-08-03)
~~~~~~~~~~~~~~~~~~

* Well, this is embarrassing...
* Added a patch to fix :code:`pip` installation
10 changes: 5 additions & 5 deletions README.rst
@@ -13,15 +13,15 @@ PySwarms
:alt: Documentation Status

.. image:: https://landscape.io/github/ljvmiranda921/pyswarms/master/landscape.svg?style=flat
:target: https://landscape.io/github/ljvmiranda921/pyswarms/master
:alt: Code Health
:target: https://landscape.io/github/ljvmiranda921/pyswarms/master
:alt: Code Health

.. image:: https://pyup.io/repos/github/ljvmiranda921/pyswarms/shield.svg
:target: https://pyup.io/repos/github/ljvmiranda921/pyswarms/
:alt: Updates
:target: https://pyup.io/repos/github/ljvmiranda921/pyswarms/
:alt: Updates

.. image:: https://img.shields.io/badge/license-MIT-blue.svg
:target: https://raw.githubusercontent.com/ljvmiranda921/pyswarms/master/LICENSE
:target: https://raw.githubusercontent.com/ljvmiranda921/pyswarms/master/LICENSE


PySwarms is a simple, Python-based, Particle Swarm Optimization (PSO) library.
12 changes: 5 additions & 7 deletions docs/api/pyswarms.rst
@@ -18,13 +18,11 @@ It supports a simple skeleton to construct a customized PSO algorithm.
Optimizers
-----------

The optimizers include the actual PSO implementations for various tasks. Generally,
there are two ways to implement an optimizer from this library: (1) as an easy
off-the-shelf algorithm, and (2) as an experimental custom-made algorithm.

1. Easy off-the-shelf implementations include those that are already considered as standard in literature. This may include the classics such as global-best and local-best. Their topologies are hardcoded already, and there is no need for prior set-up in order to use. This is useful for quick-and-easy optimization problems.

2. Experimental PSO algorithms are like standard PSO algorithms but without a defined topology. Instead, an object that inherits from a :code:`Topology` class is passed to an optimizer to define swarm behavior. Although the standard PSO implementations can be done through this, this is more experimental.
The optimizers include the actual PSO implementations for various tasks.
These are easy, off-the-shelf implementations that are already considered
standard in the literature, such as the classic global-best and local-best
algorithms. They are useful for quick-and-easy optimization problems.
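
As a minimal usage sketch of this off-the-shelf style (mirroring the notebook
examples updated in this commit, and assuming its import aliases)::

    import pyswarms as ps
    from pyswarms.utils.functions import single_obj as fx

    # Classic global-best PSO; the topology is already hardcoded in the class.
    # c1/c2 are the cognitive/social coefficients and w the inertia weight.
    options = {'c1': 0.5, 'c2': 0.3, 'w': 0.9}
    optimizer = ps.single.GBestPSO(n_particles=10, dims=2, **options)

    # Optimize the sphere function for 1000 iterations
    cost, pos = optimizer.optimize(fx.sphere_func, print_step=100, iters=1000, verbose=3)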

.. toctree::

2 changes: 1 addition & 1 deletion docs/examples/train_neural_network.rst
@@ -180,7 +180,7 @@ parameters arbitrarily.
# Initialize swarm
options = {'c1': 0.5, 'c2': 0.3, 'w':0.9}
# Call instance of PSO with bounds argument
# Call instance of PSO
dims = (4 * 20) + (20 * 3) + 20 + 3
optimizer = ps.single.GBestPSO(n_particles=100, dims=dims, **options)
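
For reference, dims here is the total number of network weights and biases
being optimized; the arithmetic is consistent with a 4-input, 20-hidden,
3-output layout (an inference from the expression, not stated in this hunk)::

    # Layer sizes inferred from the expression above (assumption, not stated in the hunk)
    n_inputs, n_hidden, n_classes = 4, 20, 3
    dims = (n_inputs * n_hidden) + (n_hidden * n_classes) + n_hidden + n_classes
    assert dims == 163  # 80 + 60 weights plus 20 + 3 biases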
11 changes: 0 additions & 11 deletions docs/features.rst
@@ -2,16 +2,6 @@
Features
========

There are two ways in which optimizers are implemented in PySwarms. The first involves
quick-and-easy implementations of classic PSO algorithms. Here, the topologies (or the way
a swarm behaves) is hardcoded in the source code. This is useful for fast implementations that
doesn't need prior set-up.

The second involves a set of experimental classes where topology is not defined. Instead, one
should create an object that inherits from a :code:`Topology` class, and pass it as a parameter
in the experimental PSO classes. There are some topologies that are already implemented, but it's also possible
to define a custom-made one. This is perfect for researchers who wanted to try out various swarm
behaviours and movements.

Single-Objective Optimizers
---------------------------
@@ -28,7 +18,6 @@ functions.

* :mod:`pyswarms.single.lb` - classic local-best Particle Swarm Optimization algorithm with a ring-topology. Every particle compares itself only with its nearest-neighbours as computed by a distance metric.

* :mod:`pyswarms.single.exp` - experimental Particle Swarm Optimization algorithm.
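
A minimal local-best sketch in the updated option-dict style used by the
notebooks in this commit (here k is taken to be the neighbour count and p the
distance norm, which is an assumption about the parameter meanings)::

    import pyswarms as ps
    from pyswarms.utils.functions import single_obj as fx

    # Local-best PSO; 'k' and 'p' are passed through the options dictionary
    options = {'c1': 0.5, 'c2': 0.3, 'w': 0.9, 'k': 2, 'p': 2}
    lbest_pso = ps.single.LBestPSO(n_particles=10, dims=2, **options)
    cost, pos = lbest_pso.optimize(fx.sphere_func, print_step=100, iters=1000, verbose=3)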

Discrete
~~~~~~~~
10 changes: 5 additions & 5 deletions docs/index.rst
@@ -12,15 +12,15 @@ Welcome to PySwarms's documentation!
:alt: Documentation Status

.. image:: https://landscape.io/github/ljvmiranda921/pyswarms/master/landscape.svg?style=flat
:target: https://landscape.io/github/ljvmiranda921/pyswarms/master
:alt: Code Health
:target: https://landscape.io/github/ljvmiranda921/pyswarms/master
:alt: Code Health

.. image:: https://pyup.io/repos/github/ljvmiranda921/pyswarms/shield.svg
:target: https://pyup.io/repos/github/ljvmiranda921/pyswarms/
:alt: Updates
:target: https://pyup.io/repos/github/ljvmiranda921/pyswarms/
:alt: Updates

.. image:: https://img.shields.io/badge/license-MIT-blue.svg
:target: https://raw.githubusercontent.com/ljvmiranda921/pyswarms/master/LICENSE
:target: https://raw.githubusercontent.com/ljvmiranda921/pyswarms/master/LICENSE

PySwarms is a simple, Python-based, Particle Swarm Optimization (PSO) library.

132 changes: 67 additions & 65 deletions examples/.ipynb_checkpoints/basic_optimization-checkpoint.ipynb
@@ -23,7 +23,9 @@
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Import modules\n",
@@ -69,27 +71,27 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Iteration 1/1000, cost: 0.241810034912\n",
"Iteration 101/1000, cost: 1.24792343462e-07\n",
"Iteration 201/1000, cost: 3.37290413126e-11\n",
"Iteration 301/1000, cost: 1.69154015e-16\n",
"Iteration 401/1000, cost: 7.45197149386e-20\n",
"Iteration 501/1000, cost: 3.10721498746e-24\n",
"Iteration 601/1000, cost: 1.88640993989e-30\n",
"Iteration 701/1000, cost: 8.54586648864e-34\n",
"Iteration 801/1000, cost: 5.35080705362e-35\n",
"Iteration 901/1000, cost: 2.8556550659e-37\n",
"Iteration 1/1000, cost: 0.172625433638\n",
"Iteration 101/1000, cost: 9.78790435841e-08\n",
"Iteration 201/1000, cost: 1.06971376846e-10\n",
"Iteration 301/1000, cost: 8.14953145258e-16\n",
"Iteration 401/1000, cost: 3.12401159696e-20\n",
"Iteration 501/1000, cost: 1.48621699689e-22\n",
"Iteration 601/1000, cost: 3.66990623598e-25\n",
"Iteration 701/1000, cost: 9.59668571497e-31\n",
"Iteration 801/1000, cost: 1.4444452957e-34\n",
"Iteration 901/1000, cost: 6.65383019372e-38\n",
"================================\n",
"Optimization finished!\n",
"Final cost: 0.000\n",
"Best value: [1.1619287820351099e-23, -9.6008484982928834e-23]\n",
"Final cost: 0.0000\n",
"Best value: [-5.4976282085417155e-21, 5.2520478160189662e-22]\n",
"\n"
]
}
],
"source": [
"# Set-up hyperparameters\n",
"options = {'c1': 0.5, 'c2': 0.3, 'm':0.9}\n",
"options = {'c1': 0.5, 'c2': 0.3, 'w':0.9}\n",
"\n",
"# Call instance of PSO\n",
"gbest_pso = ps.single.GBestPSO(n_particles=10, dims=2, **options)\n",
@@ -123,30 +125,30 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Iteration 1/1000, cost: 0.04003185568\n",
"Iteration 101/1000, cost: 6.3794607757e-06\n",
"Iteration 201/1000, cost: 7.09436163329e-12\n",
"Iteration 301/1000, cost: 1.5461139723e-15\n",
"Iteration 401/1000, cost: 1.19646365726e-16\n",
"Iteration 501/1000, cost: 1.69249151459e-19\n",
"Iteration 601/1000, cost: 4.50338461028e-25\n",
"Iteration 701/1000, cost: 5.19268925253e-28\n",
"Iteration 801/1000, cost: 1.56302357735e-30\n",
"Iteration 901/1000, cost: 9.65270491034e-36\n",
"Iteration 1/1000, cost: 0.0242054338711\n",
"Iteration 101/1000, cost: 2.97581624567e-08\n",
"Iteration 201/1000, cost: 7.11602482139e-10\n",
"Iteration 301/1000, cost: 2.03603327891e-13\n",
"Iteration 401/1000, cost: 2.06532397175e-17\n",
"Iteration 501/1000, cost: 1.91037623424e-22\n",
"Iteration 601/1000, cost: 1.6553317776e-27\n",
"Iteration 701/1000, cost: 2.01430762053e-33\n",
"Iteration 801/1000, cost: 9.56525068774e-37\n",
"Iteration 901/1000, cost: 4.88187595682e-37\n",
"================================\n",
"Optimization finished!\n",
"Final cost: 0.000\n",
"Best value: [1.083763152840743e-19, 1.1881341950447718e-19]\n",
"Final cost: 5.0000\n",
"Best value: [3.2487681005019184e-19, -5.7752296538288309e-20]\n",
"\n"
]
}
],
"source": [
"# Set-up hyperparameters\n",
"options = {'c1': 0.5, 'c2': 0.3, 'm':0.9}\n",
"options = {'c1': 0.5, 'c2': 0.3, 'w':0.9, 'k': 2, 'p': 2}\n",
"\n",
"# Call instance of PSO\n",
"lbest_pso = ps.single.LBestPSO(n_particles=10, dims=2, k=2,p=2, **options)\n",
"lbest_pso = ps.single.LBestPSO(n_particles=10, dims=2, **options)\n",
"\n",
"# Perform optimization\n",
"cost, pos = lbest_pso.optimize(fx.sphere_func, print_step=100, iters=1000, verbose=3)"
@@ -180,55 +182,55 @@
},
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 5,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# Create bounds\n",
"max_bound = 5.12 * np.ones(2)\n",
"min_bound = - max_bound\n",
"bounds = (min_bound, max_bound)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Help on function rastrigin_func in module pyswarms.utils.functions.single_obj:\n",
"\n",
"rastrigin_func(x)\n",
" Rastrigin objective function.\n",
" \n",
" Has a global minimum at :code:`f(0,0,...,0)` with a search\n",
" domain of :code:`[-5.12, 5.12]`\n",
" \n",
" Parameters\n",
" ----------\n",
" x : numpy.ndarray \n",
" set of inputs of shape :code:`(n_particles, dims)`\n",
" \n",
" Returns\n",
" -------\n",
" numpy.ndarray \n",
" computed cost of size :code:`(n_particles, )`\n",
" \n",
" Raises\n",
" ------\n",
" ValueError\n",
" When the input is out of bounds with respect to the function\n",
" domain\n",
"Iteration 1/1000, cost: 11.3842045778\n",
"Iteration 101/1000, cost: 0.00601961330037\n",
"Iteration 201/1000, cost: 1.06143502876e-08\n",
"Iteration 301/1000, cost: 1.12052589429e-11\n",
"Iteration 401/1000, cost: 7.1054273576e-15\n",
"Iteration 501/1000, cost: 0.0\n",
"Iteration 601/1000, cost: 0.0\n",
"Iteration 701/1000, cost: 0.0\n",
"Iteration 801/1000, cost: 0.0\n",
"Iteration 901/1000, cost: 0.0\n",
"================================\n",
"Optimization finished!\n",
"Final cost: 0.0000\n",
"Best value: [-3.5535227100370778e-09, -8.3294662576708084e-10]\n",
"\n"
]
}
],
"source": [
"# Create bounds\n",
"max_bound = 5.12 * np.ones(2)\n",
"min_bound = - max_bound\n",
"bounds = tuple(max_bound, min_bound)"
"# Initialize swarm\n",
"options = {'c1': 0.5, 'c2': 0.3, 'w':0.9}\n",
"\n",
"# Call instance of PSO with bounds argument\n",
"optimizer = ps.single.GBestPSO(n_particles=10, dims=2, bounds=bounds, **options)\n",
"\n",
"# Perform optimization\n",
"cost, pos = optimizer.optimize(fx.rastrigin_func, print_step=100, iters=1000, verbose=3)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": []
}
],
"metadata": {
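
One of the tutorial fixes in this notebook is the bounds construction: the
previous bounds = tuple(max_bound, min_bound) raises a TypeError because
tuple() accepts a single iterable, whereas the updated cell builds the pair
directly::

    import numpy as np

    # Rastrigin search domain is [-5.12, 5.12] in each dimension
    max_bound = 5.12 * np.ones(2)
    min_bound = -max_bound
    bounds = (min_bound, max_bound)  # (lower, upper) pair passed via the bounds= argument
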
36 changes: 18 additions & 18 deletions examples/.ipynb_checkpoints/train_neural_network-checkpoint.ipynb
@@ -215,29 +215,29 @@
"name": "stdout",
"output_type": "stream",
"text": [
"Iteration 1/1000, cost: 1.11338932053\n",
"Iteration 101/1000, cost: 0.0541135752532\n",
"Iteration 201/1000, cost: 0.0468046270747\n",
"Iteration 301/1000, cost: 0.0434828849533\n",
"Iteration 401/1000, cost: 0.0358833340106\n",
"Iteration 501/1000, cost: 0.0312474981647\n",
"Iteration 601/1000, cost: 0.0150869267541\n",
"Iteration 701/1000, cost: 0.01267166403\n",
"Iteration 801/1000, cost: 0.00632312205821\n",
"Iteration 901/1000, cost: 0.00194080306565\n",
"Iteration 1/1000, cost: 1.10288192551\n",
"Iteration 101/1000, cost: 0.0720619629541\n",
"Iteration 201/1000, cost: 0.0428161767369\n",
"Iteration 301/1000, cost: 0.0341626636208\n",
"Iteration 401/1000, cost: 0.0283418687066\n",
"Iteration 501/1000, cost: 0.023736130164\n",
"Iteration 601/1000, cost: 0.0187020687472\n",
"Iteration 701/1000, cost: 0.0175513666795\n",
"Iteration 801/1000, cost: 0.0155400739943\n",
"Iteration 901/1000, cost: 0.0142253459508\n",
"================================\n",
"Optimization finished!\n",
"Final cost: 0.0015\n",
"Best value: -0.356506 0.441392 -0.605476 0.620517 -0.156904 0.206396 ...\n",
"Final cost: 0.0139\n",
"Best value: 0.931199 -2.853534 0.348705 -4.812333 -0.258052 -0.490002 ...\n",
"\n"
]
}
],
"source": [
"# Initialize swarm\n",
"options = {'c1': 0.5, 'c2': 0.3, 'm':0.9}\n",
"options = {'c1': 0.5, 'c2': 0.3, 'w':0.9}\n",
"\n",
"# Call instance of PSO with bounds argument\n",
"# Call instance of PSO\n",
"dims = (4 * 20) + (20 * 3) + 20 + 3 \n",
"optimizer = ps.single.GBestPSO(n_particles=100, dims=dims, **options)\n",
"\n",
@@ -257,7 +257,7 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 7,
"metadata": {
"collapsed": true
},
@@ -305,16 +305,16 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"1.0"
"0.99333333333333329"
]
},
"execution_count": 13,
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}