
Remove preprocess and add scaling to FAQ #45

Merged
merged 1 commit into from
Oct 28, 2022
Conversation

danielward27
Owner

Remove the preprocess_bijection option, as it could cause confusion. For example, a user may expect to be able to run train_flow twice to train for more epochs (e.g. by rerunning a cell containing train_flow). However, this would not work when using a preprocess_bijection: the flow returned by the first call to train_flow would already be transformed by Invert(preprocess_bijection), so the output of the second call would have the inverse transformation applied twice.
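A minimal numeric sketch of the problem, using a plain affine standardization in place of a real bijection (the names `forward`/`inverse` are illustrative, not flowjax's API): applying the inverse once recovers the original data, but rerunning the training loop would apply it a second time and over-correct.

```python
import numpy as np

# Hypothetical standard-scaling "bijection": z = (x - mean) / std.
mean, std = 5.0, 2.0
forward = lambda x: (x - mean) / std  # the preprocess direction
inverse = lambda z: z * std + mean    # i.e. Invert(preprocess_bijection)

x = np.array([1.0, 5.0, 9.0])
z = forward(x)

# First call to train_flow: the returned flow already composes in the
# inverse transformation, so it maps back to the original scale.
once = inverse(z)       # recovers x exactly
# Rerunning train_flow on that flow would compose the inverse again:
twice = inverse(once)   # no longer equal to x

print(np.allclose(once, x))   # True
print(np.allclose(twice, x))  # False
```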

Instead, an example is added to the FAQ for scaling/preprocessing, which is more explicit.
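The explicit approach described above might look like the following sketch, where the user standardizes the data themselves before training and un-scales samples afterwards (the `train_flow` call and `flow.sample` method are assumptions shown as comments, not a verbatim copy of the library's FAQ):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=3.0, size=(1000, 2))

# Explicit preprocessing: standardize to zero mean and unit variance
# before training, so no hidden bijection is baked into the flow.
mean, std = x.mean(axis=0), x.std(axis=0)
x_scaled = (x - mean) / std

# flow = train_flow(key, flow, x_scaled)        # train on scaled data
# samples = flow.sample(key, n) * std + mean    # un-scale samples explicitly
```

Because the scaling lives in user code rather than inside the returned flow, rerunning the training cell simply trains on the same scaled data again, with no surprise double transformation.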

@danielward27 danielward27 merged commit da65f86 into main Oct 28, 2022
@danielward27 danielward27 deleted the preprocess branch October 28, 2022 11:07