This is more of a general symbolic regression question. I am currently using PySR to fit data generated from a very large Fourier series with some smoothness constraints. I am experimenting with gradient boosting: at each boosting step I have PySR find a simple function (e.g. maxsize = 9), and then I use a custom framework to jointly optimize the accumulated models with simulated annealing (e.g. f1, f1+f2, f1+f2+f3, and so on for num_boost_steps). What I find is that some of the simple models have vertical asymptotes, and I am wondering whether anyone has found a robust criterion that effectively weeds these solutions out in practice. The problem with penalizing the derivative or the max is that SR might just find a steeper asymptote that slips between the mesh points.
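For reference, the kind of mesh-based screen I mean looks roughly like the sketch below: evaluate a candidate model on a dense grid and reject it if it produces non-finite values, huge magnitudes, or huge jumps between neighbouring grid points. The function name and the thresholds (`value_cap`, `growth_cap`) are my own placeholders, not anything from PySR, and as noted above a sufficiently steep but narrow asymptote can still slip between the grid points.

```python
import math

def has_asymptote(f, lo, hi, n=10_000, value_cap=1e6, growth_cap=1e4):
    """Heuristic singularity screen for a 1-D candidate model f.

    Evaluates f on a uniform grid over [lo, hi] and returns True if it
    sees a non-finite value, a magnitude above value_cap, or a jump
    between adjacent grid points above growth_cap. The thresholds are
    illustrative assumptions; a steep asymptote between grid points
    can still evade this check.
    """
    prev = None
    for i in range(n):
        x = lo + (hi - lo) * i / (n - 1)
        try:
            y = f(x)
        except (ZeroDivisionError, OverflowError, ValueError):
            return True  # evaluation blew up at a grid point
        if not math.isfinite(y) or abs(y) > value_cap:
            return True  # nan/inf or absurdly large value
        if prev is not None and abs(y - prev) > growth_cap:
            return True  # suspiciously large jump between neighbours
        prev = y
    return False
```

In a boosting loop one would run this on each candidate from PySR's hall of fame before accepting it into the joint model.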
Heh, so I realized it is not a big deal, because one can patch the singularity by modifying the functional form (see the second term in the example fit equation attached).
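The attached equation isn't reproduced here, but one standard way to patch a reciprocal singularity of this kind is to replace 1/g(x) with g(x)/(g(x)² + ε), which agrees with 1/g(x) where |g(x)| ≫ √ε but stays bounded by 1/(2√ε) everywhere. A minimal sketch, with the softening scale ε chosen arbitrarily for illustration:

```python
def patched_reciprocal(g, x, eps=1e-3):
    """Bounded stand-in for 1/g(x).

    Returns g(x) / (g(x)**2 + eps): close to 1/g(x) when |g(x)| is
    large compared to sqrt(eps), but capped at 1/(2*sqrt(eps)) near
    the zero of g, so the fitted model has no vertical asymptote.
    eps is an illustrative choice, not a value from the discussion.
    """
    gx = g(x)
    return gx / (gx * gx + eps)
```

The softened form is smooth in x (wherever g is), so it also plays well with gradient-based joint optimization of the boosted sum.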
