
Adding one solution that produced a contact discontinuity
jloveric committed Feb 5, 2023
1 parent 0d36e21 commit ca6fbd5
Showing 2 changed files with 15 additions and 2 deletions.
4 changes: 4 additions & 0 deletions README.md
@@ -80,6 +80,10 @@ with polynomial refinement
```
python examples/high_order_euler.py mlp.hidden.width=20 max_epochs=10000 mlp.segments=4 mlp.n=3 mlp.hidden.layers=8 factor=0.025 mlp.layer_type=continuous optimizer.patience=200 mlp.input.segments=20 batch_size=2048 form=primitive loss_weight.discontinuity=0.0 loss_weight.interior=1.0e-1 optimizer=adamw mlp.normalize=True mlp.rotations=6 gradient_clip=5.0 loss_weight.boundary=10 loss_weight.initial=10 data_size=10000 mlp.resnet=False refinement.type=p_refine refinement.epochs=1000
```
This configuration actually captured the contact discontinuity, but the velocity is wrong:
```
python examples/high_order_euler.py mlp.hidden.width=10 max_epochs=10000 mlp.segments=2 mlp.n=3 mlp.hidden.layers=8 factor=0.025 mlp.layer_type=continuous optimizer.patience=200 mlp.input.segments=20 batch_size=2048 form=primitive loss_weight.discontinuity=0.0 loss_weight.interior=1.0e-1 optimizer=adamw mlp.normalize=True mlp.rotations=4 gradient_clip=5.0e-1 loss_weight.boundary=10 loss_weight.initial=10 data_size=10000 mlp.resnet=False refinement.type=p_refine refinement.epochs=1000
```
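These long command lines are dotlist-style overrides merged onto the script's default config (the script takes an OmegaConf DictConfig, presumably populated via Hydra). A minimal sketch of how such overrides land in the config; the default values here are illustrative placeholders, not the repo's actual defaults:
```
# Sketch only: how dotlist overrides (like the flags above) merge into a
# config. Default values below are illustrative, not the repo's defaults.
from omegaconf import OmegaConf

defaults = OmegaConf.create(
    {"mlp": {"segments": 4, "rotations": 6}, "gradient_clip": 5.0}
)
overrides = OmegaConf.from_dotlist(
    ["mlp.segments=2", "mlp.rotations=4", "gradient_clip=5.0e-1"]
)
cfg = OmegaConf.merge(defaults, overrides)
print(cfg.mlp.segments, cfg.mlp.rotations, cfg.gradient_clip)  # 2 4 0.5
```
Compared with the first command, this run uses a smaller network (mlp.hidden.width=10 vs 20, mlp.segments=2 vs 4), fewer rotations (4 vs 6), and a tighter gradient clip (5.0e-1 vs 5.0).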
## Training
High order MLP
```
13 changes: 11 additions & 2 deletions examples/high_order_euler.py
@@ -43,7 +43,11 @@ def run(cfg: DictConfig):

# diff = cfg.mlp.target_n - cfg.mlp.n
model = Net(cfg)

cfg.mlp.n = cfg.refinement.start_n
n = cfg.mlp.n
cfg.mlp.n_in = n
cfg.mlp.n_out = n
cfg.mlp.n_hidden = n
for order in range(cfg.refinement.start_n, cfg.refinement.target_n):
trainer = Trainer(
max_epochs=cfg.refinement.epochs,
@@ -54,7 +58,12 @@ def run(cfg: DictConfig):
print(f"Training order {order}")
trainer.fit(model)
# trainer.test(model)
cfg.mlp.n = order + 1
n = order + 1
cfg.mlp.n = n
cfg.mlp.n_in = n
cfg.mlp.n_out = n
cfg.mlp.n_hidden = n

next_model = Net(cfg)

interpolate_high_order_mlp(network_in=model, network_out=next_model)
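For context, the loop this diff modifies implements p-refinement: train at one polynomial order, raise cfg.mlp.n (keeping n_in, n_out, and n_hidden in sync, which is what the added lines do), build a fresh Net at the higher order, and carry the trained weights over with interpolate_high_order_mlp. A self-contained sketch of that pattern; RefineConfig and the commented calls are hypothetical stand-ins for the repo's DictConfig, Net, Trainer, and interpolate_high_order_mlp:
```
# Self-contained sketch of the p-refinement loop above. RefineConfig is a
# hypothetical stand-in for the Hydra cfg; the real loop calls
# Trainer.fit, rebuilds Net, and runs interpolate_high_order_mlp.
from dataclasses import dataclass

@dataclass
class RefineConfig:
    n: int = 2
    n_in: int = 2
    n_out: int = 2
    n_hidden: int = 2

    def set_order(self, n: int) -> None:
        # Keep every polynomial-order field in sync, as the diff does.
        self.n = self.n_in = self.n_out = self.n_hidden = n

def p_refine(start_n: int, target_n: int, epochs: int) -> RefineConfig:
    cfg = RefineConfig()
    cfg.set_order(start_n)
    for order in range(start_n, target_n):
        print(f"Training order {order} for up to {epochs} epochs")
        # trainer.fit(model) would run here
        cfg.set_order(order + 1)  # raise the order for the next model
        # next_model = Net(cfg); interpolate weights from model into
        # next_model, then continue training with model = next_model
    return cfg

p_refine(start_n=2, target_n=4, epochs=1000)
```
Interpolating the lower-order weights into the higher-order network gives each refinement stage a warm start instead of retraining from scratch.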
