✨ Come up with a consistent analysis workflow #25

Open
s-weigand opened this issue Jun 15, 2021 · 5 comments
Comments

@s-weigand
Member

Because this whole repo grew organically as a debugging tool with real-world examples for pyglotaran, it has an inconsistent style, which makes it more difficult for new users to spot the differences in the actual analysis itself.
We should therefore come up with a recommended workflow that looks and feels the same across case studies.

@s-weigand s-weigand added the enhancement New feature or request label Jun 15, 2021
@s-weigand
Member Author

Suggestion for a case study workflow structure in a notebook:

Workflow for mostly known systems

  1. Default imports from pyglotaran + extras
  2. Folder setup with setup_case_study
  3. Create a dataset dict
     A convenience function for this in the extras would be nice; its usage could look something like this:

     ```python
     >>> datasets = load_datasets(root_folder=script_folder / "data", file_names=["data1.ascii", "data2.nc"])
     {"dataset1": <xr.Dataset>, "dataset2": <xr.Dataset>}
     >>> datasets = load_datasets(root_folder=script_folder / "data", file_names={"lab1": "data1.ascii", "lab2": "data2.nc"})
     {"lab1": <xr.Dataset>, "lab2": <xr.Dataset>}
     ```

  4. Inspect the loaded data (manually calling plot functions or a widget)
  5. Display the model and parameters (self-contained document)
  6. Load model + parameters and validate them (maybe scheme.validate would be more useful, since it contains all the information?)
  7. Optimize the scheme and display the result (datasets, plots, optimized_parameters)
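A minimal sketch of what the hypothetical load_datasets convenience function could look like. Everything here is an assumption (the function does not exist yet); the actual file reading is delegated to an injected loader callable, since which pyglotaran/xarray reader would be used is not pinned down in this issue:

```python
from pathlib import Path


def load_datasets(root_folder, file_names, loader):
    """Load several datasets from ``root_folder`` into a dict.

    ``file_names`` may be a list (keys are auto-generated as
    ``dataset1``, ``dataset2``, ...) or a dict mapping labels to
    file names. ``loader`` is the function that reads one file
    (e.g. an xarray/pyglotaran reader); it is injected so this
    sketch stays I/O-agnostic.
    """
    if isinstance(file_names, dict):
        items = list(file_names.items())
    else:
        items = [(f"dataset{i}", name) for i, name in enumerate(file_names, start=1)]
    return {key: loader(Path(root_folder) / name) for key, name in items}
```

Supporting both a list and a dict keeps the quick case short while still allowing expressive dataset labels.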

Additional Guides and tooling

Unknown system

Debugging

  • Render matrix shortcut
    Currently this is a lot of manual work (example from this notebook):

    ```python
    compartments = PAL_closed_target_model.initial_concentration["input1"].compartments

    PAL_closed_target_model.megacomplex["mc1"].full_k_matrix(
        PAL_closed_target_model
    ).matrix_as_markdown(compartments).replace("0.0000e+00", "")
    ```
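The shortcut could be a small helper that renders a square rate matrix as a markdown table with zeros blanked out. The sketch below is hypothetical and deliberately generic: it takes a plain nested list plus compartment names instead of a pyglotaran megacomplex, because wrapping the exact internal API is precisely what the shortcut would do.

```python
def matrix_as_markdown(matrix, compartments, hide="0.0000e+00"):
    """Render a square matrix as a markdown table.

    Cells whose formatted value equals ``hide`` (zeros, by default)
    are left empty, mimicking the ``.replace("0.0000e+00", "")``
    trick used above.
    """
    header = "| compartment | " + " | ".join(compartments) + " |"
    separator = "|---" * (len(compartments) + 1) + "|"
    rows = []
    for name, row in zip(compartments, matrix):
        cells = [
            "" if (formatted := f"{value:.4e}") == hide else formatted
            for value in row
        ]
        rows.append("| " + name + " | " + " | ".join(cells) + " |")
    return "\n".join([header, separator] + rows)
```

Blanking the zero entries makes the connectivity of a target model readable at a glance, which is the point of the debugging shortcut.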

@s-weigand
Member Author

Additionally, parameters (e.g. compartments) should use rich, expressive names.

@s-weigand
Member Author

  • Instead of Workflow for mostly known systems, let's call it Showing a finished analysis or something along those lines.
  • Optimization "peaking" could be named try_model, since "peaking" is too ambiguous in this context, given actual peak forms like Gaussians.

@joernweissenborn
Member

So with my planned project feature, I imagine the following (environment-agnostic) workflow:

  1. Create or open a project.
  2. Import data into the project (this will copy the data, convert it to netCDF, and add the SVD and such).
  3. Interact with the data via the Project instance.
  4. Create a model base with the project API, which leverages model generation.
  5. If necessary, adapt the model file.
  6. Create a parameter file with the project API.
  7. Tune your parameters.
  8. Run an optimization.
  9. Tweak the model.
  10. Create new parameters from the previous run with the project API.
  11. Tweak the parameters.
  12. Run an optimization.
  13. Repeat from step 9.
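The first few steps of this imagined workflow could be sketched roughly as follows. Everything here is hypothetical: the Project class and its open/import_data methods are placeholders for the planned feature, not an existing pyglotaran API.

```python
from pathlib import Path


class Project:
    """Hypothetical sketch of the planned project feature."""

    def __init__(self, folder):
        self.folder = Path(folder)
        self.data = {}

    @classmethod
    def open(cls, folder):
        # Step 1: create or open a project (a real implementation
        # would lay out the project folder or read existing metadata).
        return cls(folder)

    def import_data(self, name, dataset):
        # Step 2: importing would copy the data, convert it to
        # netCDF, and add the SVD; here we only register it.
        self.data[name] = dataset

    def get_data(self, name):
        # Step 3: interact with the data via the Project instance.
        return self.data[name]
```

Keeping the project as the single entry point means notebooks, the CLI, and editors like vim can all drive the same workflow, which matches the environment-agnostic goal above.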

About debugging in e.g. notebooks I have no opinion; I personally prefer vim for models/parameters, the CLI for optimization, and notebooks only for plotting. But it's good that we are agnostic.

@s-weigand
Member Author

@joernweissenborn Since we don't have the project feature yet, and this would be a typical workflow when creating a new analysis or refining an existing one, I would categorize it under Additional Guides and tooling.

IMHO, to wow new users it would be nice to have a gallery-like section where we present completed analyses, so that when users think "cool stuff, I want to do that too!" we have guides on how to create your own.
