A Python implementation of the Ocean Wave Optimizer (OWO), a nature-inspired metaheuristic optimization algorithm based on the dynamics of dominant waves in ocean wave groups.
The Ocean Wave Optimizer mimics the physical behavior of ocean waves to solve continuous optimization problems. The algorithm balances exploration (wave interference and random perturbations) with exploitation (waves following the dominant wave pattern) to efficiently search the solution space.
- Dynamic exploration-exploitation balance: Automatically adjusts the ratio based on iteration progress
- Stagnation detection: Detects when the algorithm is stuck and resets exploration parameters
- Dominant wave tracking: Maintains the best solution (dominant wave) found during optimization
- Flexible boundary handling: Supports different bounds for each dimension
- Convergence tracking: Records fitness values across iterations for analysis
- Execution time monitoring: Tracks start time, end time, and total execution duration
```
pip install numpy
```

```
git clone https://github.com/SayedKenawy/ocean-wave-optimizer.git
cd ocean-wave-optimizer
```

```python
from OWO import OWO
import numpy as np

# Define your objective function (minimization)
def sphere_function(x):
    return np.sum(x**2)

# Set parameters
dim = 10              # Problem dimensions
SearchAgents_no = 30  # Population size (number of waves)
Max_iter = 500        # Maximum iterations
lb = -100             # Lower bound
ub = 100              # Upper bound

# Run optimizer
solution = OWO(sphere_function, lb, ub, dim, SearchAgents_no, Max_iter)

# Access results
print(f"Best fitness: {solution.convergence[-1]}")
print(f"Execution time: {solution.executionTime:.4f} seconds")
print(f"Convergence curve: {solution.convergence}")
```

| Parameter | Type | Description |
|---|---|---|
| `objf` | function | Objective function to minimize |
| `lb` | float or list | Lower bound(s) for search space |
| `ub` | float or list | Upper bound(s) for search space |
| `dim` | int | Number of dimensions in the problem |
| `SearchAgents_no` | int | Population size (number of waves) |
| `Max_iter` | int | Maximum number of iterations |
The initialization equation defines how each search agent (wave) is uniformly distributed within the bounded search space. This guarantees unbiased sampling and adequate coverage at the start of the optimization process.
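A standard way to realize this uniform initialization is `X[i, j] = lb[j] + r * (ub[j] - lb[j])` with `r ~ U(0, 1)`. The sketch below is illustrative (the function and variable names are not the library's internals); it also shows how scalar or per-dimension bounds can both be supported:

```python
import numpy as np

def initialize_waves(SearchAgents_no, dim, lb, ub, seed=None):
    """Uniformly scatter waves in [lb, ub]; lb/ub may be scalars or per-dimension arrays."""
    rng = np.random.default_rng(seed)
    lb = np.broadcast_to(np.asarray(lb, dtype=float), (dim,))
    ub = np.broadcast_to(np.asarray(ub, dtype=float), (dim,))
    # Each coordinate: lb + r * (ub - lb), with r drawn independently per entry
    return lb + rng.random((SearchAgents_no, dim)) * (ub - lb)

waves = initialize_waves(30, 10, -100, 100, seed=42)
print(waves.shape)  # (30, 10)
```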
Each wave’s position is evaluated using the objective function, transforming a multidimensional solution into a scalar fitness value.
The dominant wave represents the best solution found so far and guides the population movement.
Global exploration is achieved by introducing stochastic perturbations that let waves reach unvisited regions of the search space.
To prevent stagnation, complete random redistribution may also be applied.
Local exploitation pulls waves toward the dominant solution, refining candidate solutions around promising regions.
A nonlinear reflection strategy enhances fine-grained local adjustments.
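The basic pull toward the dominant wave can be sketched as a random-fraction step along the line connecting a wave to the best solution (the nonlinear reflection term is omitted here, and the update rule shown is an illustrative stand-in, not the exact OWO equation):

```python
import numpy as np

rng = np.random.default_rng(1)
dominant = np.zeros(10)                  # best solution found so far
wave = rng.uniform(-100, 100, size=10)

# Move a random fraction (per coordinate) of the way toward the dominant wave
step = rng.random(10)
new_wave = wave + step * (dominant - wave)
```

Because each step fraction lies in [0, 1), the new position is never farther from the dominant wave than the old one, which is what makes this an exploitation move.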
The parameter controlling exploration and exploitation decays nonlinearly to ensure smooth convergence.
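One common way to implement such a decay is a quadratic schedule that falls from 2 to 0 over the run; the exact curve used by OWO may differ, so treat this as a sketch of the idea:

```python
import numpy as np

Max_iter = 500
t = np.arange(Max_iter)

# Nonlinear (quadratic) decay from 2 toward 0: early iterations keep the
# coefficient large (exploration), late iterations shrink it (exploitation)
a = 2 * (1 - (t / Max_iter) ** 2)
```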
The best fitness value at each iteration is monitored to evaluate convergence behavior.
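Recording the best-so-far fitness each iteration yields a monotonically non-increasing convergence curve, which is what `solution.convergence` exposes. A toy sketch of the bookkeeping (the random fitness values stand in for real evaluations):

```python
import numpy as np

rng = np.random.default_rng(2)
convergence = []
best = np.inf
for t in range(100):
    # Stand-in for one OWO iteration: fitness of the best candidate this round
    fitness = rng.random() / (t + 1)
    best = min(best, fitness)
    convergence.append(best)  # best-so-far, so the curve never increases
```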
OWO can be applied to various optimization problems including:
- Hyperparameter tuning for machine learning models
- Feature selection
- Engineering design optimization
- Resource allocation problems
- Function approximation
- Neural network training
- Start with `SearchAgents_no = 30-50` for most problems
- Increase iterations for complex, high-dimensional problems
- Use appropriate bounds based on problem domain knowledge
- Monitor convergence curves to assess performance
MIT License - feel free to use and modify for your research and applications.