Changes from all commits (31 commits)
2fb97a1
add adam optimizer
YigitElma Dec 2, 2025
cf12086
Merge branch 'master' into yge/adam
YigitElma Dec 16, 2025
3ad6b89
unify the api for generic sgd type optimizers
YigitElma Dec 18, 2025
79faa1a
update docs
YigitElma Dec 18, 2025
9ea9e97
add RMSProp too
YigitElma Dec 18, 2025
6cc796b
update changelog
YigitElma Dec 18, 2025
81a3bd4
update docstring, update the equation in particles to latex
YigitElma Dec 18, 2025
731df93
update docstring
YigitElma Dec 19, 2025
53dccec
Merge branch 'master' into yge/adam
dpanici Dec 22, 2025
16f175e
Merge remote-tracking branch 'origin/master' into yge/adam
YigitElma Jan 6, 2026
ae36974
fix x_scale, rename back to sgd
YigitElma Jan 7, 2026
9eb3be2
initial attempt to add optax optimizers
YigitElma Jan 7, 2026
1cd3e98
add a test
YigitElma Jan 7, 2026
0e920b0
add a test
YigitElma Jan 7, 2026
0f60b3c
update docs for default value
YigitElma Jan 7, 2026
b9acecf
add a test to keep optax optimizers list up to date
YigitElma Jan 7, 2026
8a1f301
add key manually
YigitElma Jan 7, 2026
031aba3
update changelog
YigitElma Jan 7, 2026
680fef8
fix polyak_sgd case
YigitElma Jan 7, 2026
2305426
remove adam and rmsprop, switch back to unicode for code readability
YigitElma Jan 7, 2026
1644f29
clean up
YigitElma Jan 7, 2026
04b9ecf
Merge remote-tracking branch 'origin/master' into yge/adam
YigitElma Jan 7, 2026
6b25bbf
remove redundant tests
YigitElma Jan 7, 2026
20ffc0a
add support for custom optax optimizers
YigitElma Jan 7, 2026
c4c7aab
update changelog
YigitElma Jan 7, 2026
50bd906
minor formatting
YigitElma Jan 8, 2026
17a0a1f
address Rory's comments
YigitElma Jan 28, 2026
258d822
Merge branch 'master' into yge/adam
YigitElma Jan 28, 2026
042b686
deprecate sgd
YigitElma Jan 29, 2026
747b346
update changelog
YigitElma Jan 29, 2026
5843eaa
fix typo
YigitElma Jan 29, 2026
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -20,6 +20,8 @@ or if multiple things are being optimized, `x_scale` can be a list of dict, one
- Adds new option `x_scale='ess'` to use exponential spectral scaling from (Jang 2025) which has been shown to improve performance and robustness as an
alternative to fourier continuation methods.
- Adds ``"scipy-l-bfgs-b"`` optimizer option as a wrapper to scipy's ``"l-bfgs-b"`` method.
- The ``x_scale`` parameter can now be used with stochastic gradient descent type optimizers.
- Adds wrappers for ``optax`` optimizers. They can be used by prepending ``'optax-'`` to the name of the optimizer (e.g. ``optax-adam``). Additional arguments to the optimizer, such as ``learning_rate``, can be passed via ``options = {'optax-options': {'learning_rate': 0.01}}``. A custom ``optax`` optimizer can also be used by specifying the method as ``'optax-custom'`` and passing the ``optax`` optimizer via the ``'update-rule'`` key of ``optax-options`` in the ``options`` dictionary. See the ``optax-custom`` docstring for details (a usage sketch follows below).
- Adds ``check_intersection`` flag to ``desc.magnetic_fields.FourierCurrentPotentialField.to_Coilset``, to allow the choice of checking the resulting coilset for intersections or not.
- Changes the import paths for ``desc.external`` to require reference to the sub-modules.
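
A minimal usage sketch of the API described in the entry above. The ``optax-adam`` name and the ``'optax-options'`` key follow this PR; the equilibrium and objective setup (``Equilibrium``, ``ForceBalance``, ``get_fixed_boundary_constraints``) is ordinary DESC boilerplate used only for illustration.

```python
# Hedged sketch: solve a default equilibrium using the new optax wrapper.
from desc.equilibrium import Equilibrium
from desc.objectives import (
    ForceBalance,
    ObjectiveFunction,
    get_fixed_boundary_constraints,
)
from desc.optimize import Optimizer

eq = Equilibrium()  # placeholder equilibrium for illustration
objective = ObjectiveFunction(ForceBalance(eq=eq))
constraints = get_fixed_boundary_constraints(eq=eq)

# Select an optax optimizer by prepending 'optax-' to its name.
optimizer = Optimizer("optax-adam")

# Hyperparameters are forwarded to optax.adam through 'optax-options'.
eq.solve(
    objective=objective,
    constraints=constraints,
    optimizer=optimizer,
    maxiter=100,
    options={"optax-options": {"learning_rate": 0.01}},
)
```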

@@ -32,6 +34,11 @@ Performance Improvements

- `ProximalProjection.grad` uses a single VJP on the objective instead of multiple JVP followed by a manual VJP. This should be more efficient for expensive objectives.

Deprecations

- The ``sgd`` optimizer is deprecated in favor of ``optax-sgd`` and will be removed in a future release. To achieve the same behavior as `optimizer = Optimizer['sgd']` with `options={'alpha': ..., 'beta': ...}` once ``sgd`` is removed, use `optimizer = Optimizer['optax-sgd']` with `options={'optax-options': {'learning_rate': alpha, 'momentum': beta, 'nesterov': True}}` (see the migration sketch below).
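
A migration sketch for the mapping spelled out above. The option names (``alpha``, ``beta``, ``learning_rate``, ``momentum``, ``nesterov``) come from the changelog text; the constructor-style ``Optimizer(...)`` calls are the usual way a method is selected in DESC and are shown here only for illustration.

```python
from desc.optimize import Optimizer

# Old (deprecated): DESC's built-in sgd with Nesterov momentum.
old_optimizer = Optimizer("sgd")
old_options = {"alpha": 0.01, "beta": 0.9}

# New: the optax wrapper with the equivalent settings.
new_optimizer = Optimizer("optax-sgd")
new_options = {
    "optax-options": {"learning_rate": 0.01, "momentum": 0.9, "nesterov": True}
}
```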


v0.16.0
-------

72 changes: 61 additions & 11 deletions desc/optimize/_desc_wrappers.py
@@ -9,6 +9,41 @@
from .optimizer import register_optimizer
from .stochastic import sgd

# List of all optax optimizers to register
# The test linked below can be used to keep this list up to date:
# https://github.com/PlasmaControl/DESC/pull/2041#issuecomment-3813092445
_all_optax_optimizers = [
"adabelief",
"adadelta",
"adafactor",
"adagrad",
"adam",
"adamax",
"adamaxw",
"adamw",
"adan",
"amsgrad",
"fromage",
"lamb",
"lars",
"lbfgs",
"lion",
"nadam",
"nadamw",
"noisy_sgd",
"novograd",
"optimistic_adam_v2",
"optimistic_gradient_descent",
"polyak_sgd",
"radam",
"rmsprop",
"rprop",
"sgd",
"sign_sgd",
"sm3",
"yogi",
]


@register_optimizer(
name=["fmin-auglag", "fmin-auglag-bfgs"],
@@ -376,9 +411,18 @@ def _optimize_desc_fmin_scalar(


@register_optimizer(
name="sgd",
description="Stochastic gradient descent with Nesterov momentum"
+ "See https://desc-docs.readthedocs.io/en/stable/_api/optimize/desc.optimize.sgd.html", # noqa: E501
name=["sgd", "optax-custom"] + ["optax-" + opt for opt in _all_optax_optimizers],
Review comment (Member): are we planning on deprecating sgd in favor of optax-sgd?
description=[
"Stochastic gradient descent with Nesterov momentum. See "
+ "https://desc-docs.readthedocs.io/en/stable/_api/optimize/desc.optimize.sgd.html", # noqa: E501
"Wrapper for custom ``optax`` optimizer. See "
+ "https://desc-docs.readthedocs.io/en/stable/_api/optimize/desc.optimize.sgd.html", # noqa: E501
]
+ [
f"``optax`` wrapper for {opt}. See "
+ f"https://optax.readthedocs.io/en/latest/api/optimizers.html#optax.{opt}" # noqa: E501
for opt in _all_optax_optimizers
],
scalar=True,
equality_constraints=False,
inequality_constraints=False,
@@ -400,15 +444,20 @@ def _optimize_desc_stochastic(
x0 : ndarray
Starting point.
method : str
Name of the method to use.
x_scale : array_like or ‘jac’, optional
Name of the method to use. Available options are `'sgd'`.
Additionally, ``optax`` optimizers can be used by specifying the method as
``'optax-<optimizer_name>'``, where ``<optimizer_name>`` is any valid ``optax``
optimizer. Hyperparameters for the ``optax`` optimizer must be passed via the
``'optax-options'`` key of the ``options`` dictionary. A custom ``optax``
optimizer can be used by specifying the method as ``'optax-custom'`` and
passing the ``optax`` optimizer via the ``'update-rule'`` key of
``'optax-options'`` in the ``options`` dictionary.
x_scale : array_like or 'auto', optional
Characteristic scale of each variable. Setting x_scale is equivalent to
reformulating the problem in scaled variables xs = x / x_scale. An alternative
view is that the size of a trust region along jth dimension is proportional to
x_scale[j]. Improved convergence may be achieved by setting x_scale such that
a step of a given size along any of the scaled variables has a similar effect
on the cost function. If set to ‘jac’, the scale is iteratively updated using
the inverse norms of the columns of the Jacobian matrix.
reformulating the problem in scaled variables xs = x / x_scale. Improved
convergence may be achieved by setting x_scale such that a step of a given
size along any of the scaled variables has a similar effect on the cost
function. Defaults to 'auto', meaning no scaling.
verbose : int
* 0 : work silently.
* 1 : display a termination report.
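
The ``'optax-custom'`` path described in the docstring above can be exercised roughly as follows. This is a sketch, not part of the diff; the only PR-specific names are ``'optax-custom'``, ``'optax-options'``, and ``'update-rule'``, and the particular ``optax`` chain is just an illustrative update rule.

```python
# Hedged sketch: pass any optax update rule (a GradientTransformation)
# through the 'update-rule' key of 'optax-options'.
import optax

from desc.optimize import Optimizer

custom_rule = optax.chain(
    optax.clip_by_global_norm(1.0),   # clip gradients for stability
    optax.adamw(learning_rate=1e-3),  # AdamW update step
)

optimizer = Optimizer("optax-custom")
options = {"optax-options": {"update-rule": custom_rule}}
# `optimizer` and `options` would then be passed to e.g. Equilibrium.solve,
# exactly as with any other DESC optimizer.
```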
@@ -438,6 +487,7 @@ def _optimize_desc_stochastic(
grad=objective.grad,
args=(objective.constants,),
method=method,
x_scale=x_scale,
ftol=stoptol["ftol"],
xtol=stoptol["xtol"],
gtol=stoptol["gtol"],