Hello @MilesCranmer,

I’m trying to implement a custom loss function, written as a Julia string, to use with PySR, and I’m encountering some issues. Here’s what I’m aiming to achieve.
I want to constrain the distribution $H^{u-d}(X, 0, t)$ with a specific moment formula at several values of $t$. Specifically, the target relation is $\int_0^1 H^{u-d}(X, 0, t)\,dX = A_{10}^{u-d}(t)$, where $t$ takes on around 10 specific values, and I want to compute the integral of $H^{u-d}(X, 0, t)$ at each of them.
The custom loss should compute the difference between this integral and the target value $A_{10}^{u-d}(t)$ at each $t$ value. I plan to sum up these differences (or take their average) to form the final loss.
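Written out in my notation, the constraint and the loss I have in mind are (summing over the roughly 10 available $t_i$):

```math
\int_0^1 H^{u-d}(X, 0, t_i)\,dX = A_{10}^{u-d}(t_i),
\qquad
\mathcal{L}_{\text{moment}} = \sum_{i} \left| \int_0^1 H^{u-d}(X, 0, t_i)\,dX - A_{10}^{u-d}(t_i) \right|
```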
I attempted to implement this in Julia by evaluating $H^{u-d}(X, 0, t)$ (assumed to be represented by PySR’s predictions) and then using `quadgk` from QuadGK.jl for the numerical integration over $X$ from 0 to 1.
I ran into issues with the function signature and type mismatches: PySR seems to pass individual values (two `Float32`s) instead of the expected `Dataset` structure, which causes the following error:
```
JuliaError: MethodError: no method matching CustomLossWithIntegralConstraint(::Float32, ::Float32)
Closest candidates are:
  CustomLossWithIntegralConstraint(::Any, !Matched::Dataset{T, L, AX, AY, AW} where {AX<:AbstractMatrix{T}, AY<:Union{Nothing, AbstractVector{T}}, AW<:Union{Nothing, AbstractVector{T}}}, !Matched::Any, !Matched::Any) where {T, L}
   @ Main none:5
```
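From the "closest candidates" line, my guess is that I registered the function as `elementwise_loss` (which, as far as I understand, is called with two scalars) rather than as `loss_function`, which I believe receives the tree, the dataset, and the options. For reference, this is roughly the bare skeleton I am aiming for; a minimal sketch based on my reading of the custom-objective examples (the extra `idx` argument in my own definition is dropped here, since I am not sure whether it is needed):

```julia
# Minimal sketch of a full-objective loss, assuming it is passed to PySR via
# the `loss_function` argument (not `elementwise_loss`).
function CustomLossWithIntegralConstraint(tree, dataset::Dataset{T,L}, options) where {T,L}
    predictions, completed = eval_tree_array(tree, dataset.X, options)
    !completed && return L(Inf)
    # Placeholder objective: plain mean-squared error against dataset.y;
    # the integral constraints would be added on top of this.
    return L(sum(abs2, predictions .- dataset.y) / dataset.n)
end
```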
This is the first method I tried:
```julia
using QuadGK

function CustomLossWithIntegralConstraint(tree, dataset::Dataset{T,L}, options, idx)::L where {T,L}
    # Evaluate the symbolic expression (tree) on the dataset to get predictions
    predictions, success = eval_tree_array(tree, dataset.X, options)
    if !success
        return L(Inf)
    end

    # Define H^{u-d}(X, 0, t) as a function of X
    # (here it is just the constant predictions[1], which is part of my problem)
    H_ud(X) = predictions[1]

    # Target values A_{10}^{u-d}(t) from the table, one per t value
    A10_values = [
        0.851, 0.702, 0.607, 0.573, 0.487, 0.359, 0.396, 0.376,
        0.320, 0.266, 0.214,
    ]

    # Accumulate the loss over all t values
    total_loss = zero(L)
    for target_value in A10_values
        # Compute the integral of H^{u-d}(X, 0, t) over X from 0 to 1
        integral_result = quadgk(H_ud, 0, 1)[1]
        # Absolute difference between the integral and the target value
        total_loss += abs(integral_result - target_value)
    end

    return total_loss
end
```
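What I suspect I actually need is to re-evaluate the tree at every quadrature point rather than integrating a constant. A sketch of what I mean, assuming the three input features are log(x), log(1-x), log(t) in that order and that `eval_tree_array` takes a features-by-points matrix:

```julia
using QuadGK

# Sketch: build an integrand that re-evaluates the expression at a single x
# for a fixed t, instead of returning the constant predictions[1].
function make_integrand(tree, options, t)
    return function (x)
        X_point = reshape([log(x), log(1 - x), log(t)], 3, 1)  # features x points
        out, completed = eval_tree_array(tree, X_point, options)
        return completed ? out[1] : NaN
    end
end

# For one t value (small margins keep log(x) and log(1 - x) finite):
# integral_result, err = quadgk(make_integrand(tree, options, t_i), 1e-6, 1 - 1e-6)
```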
I have also tried another method:
```julia
using Statistics  # for mean

individual_differences = Float64[]
for (i, t_i) in enumerate(t_values)
    # dataset.X is stored as (n_features x n_rows) in SymbolicRegression.jl,
    # so the number of data rows is size(dataset.X, 2)
    n_grid_points = size(dataset.X, 2)

    # Dense grid of x values on (0, 1); small margins keep log(x) and log(1 - x) finite
    dense_x = range(1e-6, stop=1 - 1e-6, length=n_grid_points) |> collect

    # Inputs are (log(x), log(1-x), log(t)) with t_i fixed along the grid;
    # eval_tree_array expects a (features x points) matrix, hence the permutedims
    dense_X_matrix_t = permutedims(hcat(log.(dense_x), log.(1 .- dense_x), fill(log(t_i), length(dense_x))))

    # Evaluate the model on this dense grid
    predictions_dense, flag_dense = eval_tree_array(tree, dense_X_matrix_t, options)
    if !flag_dense
        return Inf
    end

    # Formula 1: F_1(t) (experimental data instead of A_{10}^{u-d}(t))
    integral_F1 = mean(predictions_dense)  # mean over a uniform grid on (0, 1) ~ the integral
    F1_target = F1_data[i]
    push!(individual_differences, (integral_F1 - F1_target)^2)
end
```
That is, I take the mean of the predictions as the integral from 0 to 1, but every time I get the same integral prediction for all $t$ values.
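(As far as I can tell, the plain mean is only a rough approximation of the integral even on a uniform grid; a trapezoidal version of the same idea would be something like this sketch:)

```julia
# Sketch: trapezoidal rule on the same dense grid, instead of the plain mean.
function trapezoid(xs::AbstractVector, ys::AbstractVector)
    s = zero(promote_type(eltype(xs), eltype(ys)))
    for k in 1:(length(xs) - 1)
        s += (xs[k + 1] - xs[k]) * (ys[k + 1] + ys[k]) / 2
    end
    return s
end

# integral_F1 = trapezoid(dense_x, predictions_dense)
```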
I would also like to know how to print `individual_differences` (or any of the losses) from the Julia side, so that I can plot the integrals.
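(For the printing part, I assume something as crude as the following works from inside the loss; the output goes to the stdout of whichever Julia process runs the loss, so with multiprocessing it may be safer to append to a file and plot from that later:)

```julia
# Sketch: crude debugging output from inside the loss function.
for (i, d) in enumerate(individual_differences)
    println("t index = ", i, ", squared difference = ", d)
end

# Or append to a text file that can be read back for plotting:
open("integral_differences.log", "a") do io
    println(io, join(individual_differences, ","))
end
```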
I am actually supposed to do this integral for five more formulas and add them all up into the total loss.
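(My rough plan for that, with hypothetical names for the per-formula integrand builders and target vectors, is something like this sketch:)

```julia
# Sketch with hypothetical names: each constraint pairs a target vector with its
# own integrand builder; the per-formula losses are summed into the total loss.
constraints = [
    (A10_values, make_integrand_A10),  # hypothetical integrand builders,
    (F1_data,    make_integrand_F1),   # one per moment formula
    # ... the remaining formulas would be listed here
]

constraint_loss = 0.0
for (targets, build_integrand) in constraints
    for (i, t_i) in enumerate(t_values)
        integral_i, _ = quadgk(build_integrand(tree, options, t_i), 1e-6, 1 - 1e-6)
        constraint_loss += (integral_i - targets[i])^2
    end
end
# total loss = data term + constraint_loss (or a weighted combination)
```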