augur export prefers num_date over numdate exported by augur refine #215
Comments
My 2¢:
This is related to #180 and duplicates that issue's request to pick a standard name for numerical dates. This ticket adds the request that augur export exit on name collisions.
treetime uses …
Revisiting this issue after the augur v6 updates to how auspice JSONs are exported: the original issue with conflicting key names in metadata and node JSONs remains. The current implementation for merging metadata and node JSONs in augur export v2 explicitly overwrites the keys in the metadata. This can silently overwrite data that the user would expect to pass through to the final auspice JSONs. Instead of overwriting overlapping keys, …
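To make the suggestion concrete, here is a minimal sketch of a merge that reports key collisions instead of silently overwriting them. This is a hypothetical illustration, not augur's actual implementation; the function name `merge_node_attrs` and the `strict` flag are assumptions for the example.

```python
# Hypothetical sketch (not augur's actual code) of merging per-strain
# metadata with node-data values while flagging key collisions instead
# of silently overwriting them.
import warnings


def merge_node_attrs(metadata, node_data, strict=False):
    """Merge node_data into a copy of metadata, reporting collisions.

    metadata and node_data each map attribute name -> value for one node.
    With strict=True a collision raises; otherwise it only warns and the
    node-data value wins, matching the current overwrite behavior.
    """
    merged = dict(metadata)
    collisions = set(metadata) & set(node_data)
    if collisions:
        message = f"Conflicting keys would be overwritten: {sorted(collisions)}"
        if strict:
            raise ValueError(message)
        warnings.warn(message)
    merged.update(node_data)  # node-data values win on collision
    return merged
```

The `strict` toggle captures both options discussed in this thread: exit loudly on collisions, or merely print a warning and keep the current behavior.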
@huddlej Do I understand correctly that raising an exception here means the export will fail, since the function won't return? Maybe just printing a warning in the parse function would be enough?
The augur refine command exports branch length information including a `numdate` field that provides a likely estimate for each node's date. The augur export command checks for both `numdate` and `num_date` fields in any "node data" JSONs or the metadata TSV and preferentially selects the `num_date` information. In the case where the metadata TSV has a `num_date` field that was annotated by a previous step in the pipeline, the maximum likelihood `numdate` value from augur refine is omitted in favor of the metadata's value.

There are a couple of issues to figure out:
Even if the solution to the second part is to print an error and exit loudly, that would allow users to track down the issue with their data.
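The preference described in the issue body can be illustrated with a short sketch. This is a hypothetical reconstruction of the behavior being reported, not augur's actual code; the function name `pick_numeric_date` is invented for the example.

```python
# Hypothetical illustration (not augur's actual code) of the reported
# behavior: when both spellings are present among a node's attributes,
# `num_date` wins and the `numdate` value from augur refine is dropped.
def pick_numeric_date(attrs):
    """Return the numeric date, preferring 'num_date' over 'numdate'."""
    for key in ("num_date", "numdate"):
        if key in attrs:
            return attrs[key]
    return None
```

With this ordering, a `num_date` column annotated in the metadata TSV silently shadows the maximum likelihood `numdate` estimate from augur refine, which is exactly the collision this issue asks augur export to detect.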