
Update PINNInterface Inheritance #542

Merged

merged 3 commits into dev from update_pinn on Apr 14, 2025

Conversation

dario-coscia
Collaborator

I propose these two changes:

  1. Move the add_param_group call for InverseProblem parameters from PINNInterface to SingleSolverInterface, since a supervised single solver could in principle also solve inverse problems. For multi-solvers it is not clear which optimizer the parameters should be attached to (case specific).
  2. PINNInterface should inherit from SupervisedSolverInterface to avoid code duplication. A rough sketch of the resulting hierarchy follows.
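
A minimal sketch of the proposed layout; the class names match the ones discussed here, but the bodies, signatures, and the helper `_register_inverse_problem_params` are simplified assumptions, not PINA's actual implementation:

```python
# Hypothetical sketch of the proposed hierarchy, not PINA's actual code.
from abc import ABCMeta, abstractmethod


class SingleSolverInterface(metaclass=ABCMeta):
    """Solvers with exactly one model and one optimizer."""

    def _register_inverse_problem_params(self, optimizer, unknown_params):
        # Proposed new home for the InverseProblem hook: with a single
        # optimizer there is no ambiguity about which optimizer the
        # unknown parameters should be attached to.
        optimizer.add_param_group({"params": list(unknown_params)})


class SupervisedSolverInterface(SingleSolverInterface, metaclass=ABCMeta):
    @abstractmethod
    def loss_data(self, input_pts, output_pts):
        """Data-fitting loss, required for purely supervised training."""


class PINNInterface(SupervisedSolverInterface, metaclass=ABCMeta):
    """Change 2: inherit loss_data and friends instead of re-declaring them."""

    @abstractmethod
    def loss_phys(self, samples, equation):
        """Physics residual loss."""
```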

@dario-coscia added the pr-to-review label on Apr 10, 2025
@dario-coscia self-assigned this on Apr 10, 2025
Contributor

github-actions bot commented Apr 10, 2025

Code Coverage Summary

Filename                                                   Stmts    Miss  Cover    Missing
-------------------------------------------------------  -------  ------  -------  ---------------------------------------------------------------------------------------------------------------------
__init__.py                                                    7       0  100.00%
collector.py                                                  40       1  97.50%   46
graph.py                                                     114      11  90.35%   99-100, 112, 124, 126, 142, 144, 166, 169, 182, 271
label_tensor.py                                              248      32  87.10%   81, 121, 144-148, 165, 177, 182, 188-193, 273, 280, 332, 334, 348, 444-447, 490, 548, 632, 652-654, 667-676, 691, 713
operator.py                                                   68       5  92.65%   250-268, 457
operators.py                                                   6       6  0.00%    3-12
plotter.py                                                     1       1  0.00%    3
trainer.py                                                    75       6  92.00%   195-204, 293, 314, 318, 322
utils.py                                                      56       8  85.71%   113, 150, 153, 156, 192-195
adaptive_function/__init__.py                                  3       0  100.00%
adaptive_function/adaptive_function.py                        55       0  100.00%
adaptive_function/adaptive_function_interface.py              51       6  88.24%   98, 141, 148-151
adaptive_functions/__init__.py                                 6       6  0.00%    3-12
callback/__init__.py                                           5       0  100.00%
callback/adaptive_refinement_callback.py                       8       1  87.50%   37
callback/linear_weight_update_callback.py                     28       1  96.43%   63
callback/optimizer_callback.py                                22       1  95.45%   34
callback/processing_callback.py                               49       5  89.80%   42-43, 73, 168, 171
callbacks/__init__.py                                          6       6  0.00%    3-12
condition/__init__.py                                          7       0  100.00%
condition/condition.py                                        35       8  77.14%   23, 127-128, 131-132, 135-136, 151
condition/condition_interface.py                              32       3  90.62%   31, 76, 100
condition/data_condition.py                                   26       1  96.15%   56
condition/domain_equation_condition.py                        19       0  100.00%
condition/input_equation_condition.py                         44       1  97.73%   129
condition/input_target_condition.py                           44       1  97.73%   125
data/__init__.py                                               3       0  100.00%
data/data_module.py                                          204      22  89.22%   42-53, 133, 173, 194, 233, 314-318, 324-328, 400, 467, 547, 638, 640
data/dataset.py                                               80       6  92.50%   42, 123-126, 291
domain/__init__.py                                            10       0  100.00%
domain/cartesian.py                                          112      10  91.07%   37, 47, 75-76, 92, 97, 103, 246, 256, 264
domain/difference_domain.py                                   25       2  92.00%   54, 87
domain/domain_interface.py                                    20       5  75.00%   37-41
domain/ellipsoid.py                                          104      24  76.92%   52, 56, 127, 250-257, 269-282, 286-287, 290, 295
domain/exclusion_domain.py                                    28       1  96.43%   86
domain/intersection_domain.py                                 28       1  96.43%   85
domain/operation_interface.py                                 26       1  96.15%   88
domain/simplex.py                                             72      14  80.56%   62, 207-225, 246-247, 251, 256
domain/union_domain.py                                        25       2  92.00%   43, 114
equation/__init__.py                                           4       0  100.00%
equation/equation.py                                          11       0  100.00%
equation/equation_factory.py                                  24      10  58.33%   37, 62-75, 97-110, 132-145
equation/equation_interface.py                                 4       0  100.00%
equation/system_equation.py                                   22       0  100.00%
geometry/__init__.py                                           7       7  0.00%    3-15
loss/__init__.py                                               7       0  100.00%
loss/loss_interface.py                                        17       2  88.24%   45, 51
loss/lp_loss.py                                               15       0  100.00%
loss/ntk_weighting.py                                         26       0  100.00%
loss/power_loss.py                                            15       0  100.00%
loss/scalar_weighting.py                                      16       0  100.00%
loss/weighting_interface.py                                    6       0  100.00%
model/__init__.py                                             10       0  100.00%
model/average_neural_operator.py                              31       2  93.55%   73, 82
model/deeponet.py                                             93      13  86.02%   187-190, 209, 240, 283, 293, 303, 313, 323, 333, 488, 498
model/feed_forward.py                                         89      11  87.64%   58, 195, 200, 278-292
model/fourier_neural_operator.py                              78      10  87.18%   96-100, 110, 155-159, 218, 220, 242, 342
model/graph_neural_operator.py                                40       2  95.00%   58, 60
model/kernel_neural_operator.py                               34       6  82.35%   83-84, 103-104, 123-124
model/low_rank_neural_operator.py                             27       2  92.59%   89, 98
model/multi_feed_forward.py                                   12       5  58.33%   25-31
model/spline.py                                               89      37  58.43%   30, 41-66, 69, 128-132, 135, 159-177, 180
model/block/__init__.py                                       12       0  100.00%
model/block/average_neural_operator_block.py                  12       0  100.00%
model/block/convolution.py                                    64      13  79.69%   77, 81, 85, 91, 97, 111, 114, 151, 161, 171, 181, 191, 201
model/block/convolution_2d.py                                146      27  81.51%   155, 162, 282, 314, 379-433, 456
model/block/embedding.py                                      48       7  85.42%   93, 143-146, 155, 168
model/block/fourier_block.py                                  31       0  100.00%
model/block/gno_block.py                                      22       4  81.82%   73-77, 87
model/block/integral.py                                       18       4  77.78%   22-25, 71
model/block/low_rank_block.py                                 24       0  100.00%
model/block/orthogonal.py                                     37       0  100.00%
model/block/pod_block.py                                      65       9  86.15%   54-57, 69, 99, 134-139, 170, 195
model/block/rbf_block.py                                     179      25  86.03%   18, 42, 53, 64, 75, 86, 97, 223, 280, 282, 298, 301, 329, 335, 363, 367, 511-524
model/block/residual.py                                       46       0  100.00%
model/block/spectral.py                                       83       4  95.18%   132, 140, 262, 270
model/block/stride.py                                         28       7  75.00%   55, 58, 61, 67, 72-74
model/block/utils_convolution.py                              22       3  86.36%   58-60
model/layers/__init__.py                                       6       6  0.00%    3-12
optim/__init__.py                                              5       0  100.00%
optim/optimizer_interface.py                                   7       0  100.00%
optim/scheduler_interface.py                                   7       0  100.00%
optim/torch_optimizer.py                                      14       0  100.00%
optim/torch_scheduler.py                                      19       2  89.47%   5-6
problem/__init__.py                                            6       0  100.00%
problem/abstract_problem.py                                  104      14  86.54%   52, 61, 101-106, 135, 147, 165, 239, 243, 272
problem/inverse_problem.py                                    22       0  100.00%
problem/parametric_problem.py                                  8       1  87.50%   29
problem/spatial_problem.py                                     8       0  100.00%
problem/time_dependent_problem.py                              8       0  100.00%
problem/zoo/__init__.py                                        8       0  100.00%
problem/zoo/advection.py                                      33       7  78.79%   36-38, 52, 108-110
problem/zoo/allen_cahn.py                                     20       6  70.00%   20-22, 34-36
problem/zoo/diffusion_reaction.py                             29       5  82.76%   94-104
problem/zoo/helmholtz.py                                      30       6  80.00%   36-42, 103-107
problem/zoo/inverse_poisson_2d_square.py                      31       0  100.00%
problem/zoo/poisson_2d_square.py                              19       3  84.21%   65-70
problem/zoo/supervised_problem.py                             11       0  100.00%
solver/__init__.py                                             6       0  100.00%
solver/garom.py                                              107       2  98.13%   129-130
solver/solver.py                                             188      10  94.68%   192, 215, 287, 290-291, 350, 432, 515, 556, 562
solver/ensemble_solver/__init__.py                             4       0  100.00%
solver/ensemble_solver/ensemble_pinn.py                       23       1  95.65%   104
solver/ensemble_solver/ensemble_solver_interface.py           27       0  100.00%
solver/ensemble_solver/ensemble_supervised.py                  9       0  100.00%
solver/physics_informed_solver/__init__.py                     8       0  100.00%
solver/physics_informed_solver/causal_pinn.py                 47       3  93.62%   157, 166-167
solver/physics_informed_solver/competitive_pinn.py            58       0  100.00%
solver/physics_informed_solver/gradient_pinn.py               17       0  100.00%
solver/physics_informed_solver/pinn.py                        18       0  100.00%
solver/physics_informed_solver/pinn_interface.py              47       1  97.87%   130
solver/physics_informed_solver/rba_pinn.py                    35       3  91.43%   155-158
solver/physics_informed_solver/self_adaptive_pinn.py          90       3  96.67%   315-318
solver/supervised_solver/__init__.py                           4       0  100.00%
solver/supervised_solver/reduced_order_model.py               24       1  95.83%   137
solver/supervised_solver/supervised.py                         7       0  100.00%
solver/supervised_solver/supervised_solver_interface.py       25       0  100.00%
solvers/__init__.py                                            6       6  0.00%    3-12
solvers/pinns/__init__.py                                      6       6  0.00%    3-12
TOTAL                                                       4457     494  88.92%

Results for commit: 64bf7bd

Minimum allowed coverage is 80.123%

♻️ This comment has been updated with latest results

Collaborator

@GiovanniCanali left a comment

Hi @dario-coscia,

I am not comfortable with PINNInterface inheriting from SupervisedSolverInterface, as I find it conceptually misleading: PINNs can be entirely "unsupervised". On the other hand, your implementation seems to be logically sound and streamlines the code, even if the reduction is relatively modest (~50 lines).

I would suggest waiting for additional opinions from others on this point.

As for the changes related to InverseProblem, I have no objections. It would be nice to do the same thing for Multi solvers, but I reckon it might be a bit trickier.

@dario-coscia
Collaborator Author

I see your point, and it is a concern that I share. Inheriting from SupervisedSolverInterface is done to reduce the lines of code (and avoid repeating the loss_data method). In my opinion, every method present in SupervisedSolverInterface should be inherited by PINNs, as they can also work in a supervised manner. The problem is rather that loss_data is abstract, which means that a user inheriting from PINNInterface must implement it, even though sometimes it is not needed. Making it non-abstract in SupervisedSolverInterface, on the other hand, would not make sense.

A possible solution would be to override loss_data in PINNInterface and make it non-abstract, so it is up to the user whether the method must be defined. What do you think?

@dario-coscia
Collaborator Author

@FilippoOlivo what do you think?

@FilippoOlivo
Member

Hi @dario-coscia @GiovanniCanali,
Thanks for the PR. Personally, I find the inheritance of SupervisedSolverInterface in PINNInterface quite straightforward, since a PINN can also be trained in a supervised fashion (tell me if I am wrong, please). However, I share the concern about having to override loss_data every time, even if the model does not involve supervised data. For this reason, I suggest overriding the loss_data method with a simple function which raises a NotImplementedError. In this case, it is up to the user whether to override the function or not, based on the type of training.
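
A minimal sketch of this suggestion (stub classes with assumed signatures, not the code that was merged):

```python
from abc import ABCMeta, abstractmethod


class SupervisedSolverInterface(metaclass=ABCMeta):  # stub for illustration
    @abstractmethod
    def loss_data(self, input_pts, output_pts): ...


class PINNInterface(SupervisedSolverInterface):
    def loss_data(self, input_pts, output_pts):
        # Concrete on purpose: physics-only PINNs never call this, and
        # supervised-style PINNs override it with a real data loss.
        raise NotImplementedError(
            "Override loss_data to train this solver on supervised data."
        )
```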

@dario-coscia
Collaborator Author

We could think of adding the @typing.override decorator (see here) on loss_data and do what @FilippoOlivo suggested! What do you think, @GiovanniCanali?

@GiovanniCanali
Collaborator

I am sorry, but I am not sure I have fully understood what you mean: are you still planning to let PINNInterface inherit from SupervisedSolverInterface? If yes, could you clarify the purpose of using the @typing.override decorator in that case? If not, could you explain how this approach differs from the current implementation in the package?

@dario-coscia
Collaborator Author

Currently, PINNInterface inherits from SolverInterface, meaning that we must redefine methods (loss_data) and attributes (loss). This leads to bugs: suppose we no longer want loss_data in SupervisedSolverInterface but call it pippo instead; then we would have to remember to change it in PINNInterface as well.

When inheriting from SupervisedSolverInterface we no longer have the problem of redefining methods and attributes, but we have the problem that loss_data is abstract, meaning that a user building a new solver has to write the loss_data function, even when he/she does not want it. We can easily fix this by defining a loss_data function in PINNInterface that raises a NotImplementedError. The remaining problem is that if in the future we change the loss_data method in SupervisedSolverInterface, this is not automatically flagged as an error in PINNInterface. This is why the @override decorator is used: it indicates that loss_data in PINNInterface is intended to override a method or attribute in a superclass (SupervisedSolverInterface), as sketched below.
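
A small sketch of what the decorator buys us (typing.override from Python 3.12, stub classes with assumed names):

```python
from typing import override  # Python >= 3.12 (PEP 698)


class SupervisedSolverInterface:
    def loss_data(self, input_pts, output_pts):
        ...  # data-fitting loss lives here


class PINNInterface(SupervisedSolverInterface):
    @override  # a static type checker (e.g. mypy, pyright) errors here
    def loss_data(self, input_pts, output_pts):  # if the base method is
        raise NotImplementedError                # renamed, e.g. to pippo
```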

@FilippoOlivo
Member

Agree, fine for me!

@dario-coscia
Collaborator Author

Thank you @GiovanniCanali and @FilippoOlivo for the nice discussion. Unfortunately, the @override decorator was only introduced in typing in Python 3.12; therefore, we cannot use it at the moment. If we drop the older Python versions in future releases (see #529) we can come back to this. At the moment, I put a NotImplementedError.
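
For reference, projects that still support older interpreters often gate the decorator behind an import fallback; a sketch of that pattern (my assumption, not what was merged here):

```python
try:  # typing.override exists only on Python >= 3.12
    from typing import override
except ImportError:
    try:  # widely-used backport package
        from typing_extensions import override
    except ImportError:
        def override(func):  # last-resort no-op so the code stays importable
            return func
```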

@dario-coscia
Collaborator Author

@FilippoOlivo is it now OK for you? Can we merge?

@FilippoOlivo merged commit a84b105 into dev on Apr 14, 2025
19 checks passed
@FilippoOlivo deleted the update_pinn branch on April 14, 2025 at 07:45
dario-coscia added a commit that referenced this pull request Apr 14, 2025
FilippoOlivo pushed a commit to FilippoOlivo/PINA that referenced this pull request Apr 17, 2025
dario-coscia added a commit that referenced this pull request Apr 17, 2025
dario-coscia added a commit that referenced this pull request Apr 23, 2025