Use cell tags instead of cell meta-data to mark "frozen"/demo cells #226
Hello @hoangthienan95, good to know that you enjoy Jupytext, and thanks for your feedback! Regarding the implementation, what you're asking for would be very easy to do. You would just need to insert a new condition in the `is_active` function. The condition could look like this:
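A minimal sketch of such a condition, assuming `is_active` receives the target extension and the cell metadata (the exact signature and placement in the jupytext code base may differ):

```python
# Hypothetical addition inside is_active(ext, metadata):
# a cell tagged 'not-active-in-scripts' keeps running in .ipynb
# but is treated as inactive (commented out) in all script formats.
if 'not-active-in-scripts' in metadata.get('tags', []):
    return ext == '.ipynb'
```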
Obviously we'd need a better tag name than 'not-active-in-scripts'! Do you have a preferred tag name for this?
Hi @mwouts, thanks so much for the swift reply! I'm mostly putting extra cells below my complicated functions to provide a demo of how the function works, show what the returned data/dataframe schema looks like after calling the function, and explain design decisions I have made (for example, to avoid an edge case with no other simple way around it). It's like a docstring, but executable if you have the `.ipynb`. As for the tag name, I have a suggestion below. To piggy-back on that, how do I make what's commented out in these cells a docstring?
I just thought that maybe other people will want the cells to be active in some specific extensions and not others; in that case they could add their own "active-[extension]" or "inactive-[extension]" tags and write something quick to parse that themselves.
I like the "active-[extension]" proposal, as it resembles what we have with the "active" metadata. So "active-ipynb" would mean active only in the ipynb format, and "active-ipynb-md" would mean active only in the ipynb and md extensions.
Well, if you want to collaborate with other users, I would recommend that they all use the same convention! So I'd rather try to define a good convention here...
Sorry, there's no way to do exactly that. Still, if you don't want dead code, you could put it under an `if` guard.
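One possible reading of that suggestion (my own sketch, not necessarily the exact guard meant here) is to keep the demo under the standard main guard, so it runs when the script is executed directly but not when the module is imported:

```python
import pandas as pd

def complicated_function(df: pd.DataFrame) -> pd.DataFrame:
    """Placeholder for one of the module's real functions."""
    return df.assign(total=df.sum(axis=1))

if __name__ == "__main__":
    # Demo: executed when running the script directly, skipped on import.
    demo = pd.DataFrame({"a": [1, 2], "b": [3, 4]})
    print(complicated_function(demo))
```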
Hello @hoangthienan95, in the new release (version 1.1.2) you will be able to mark cells as active in ipynb only, using an "active-ipynb" cell tag.
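As an illustration (my own sketch of the expected round trip, assuming a py:percent pairing; the exact rendering may differ), a cell tagged "active-ipynb" would keep running in the `.ipynb` file while appearing commented out in the paired script, something like:

```python
# %% tags=["active-ipynb"]
# # Demo cell: runs in the .ipynb notebook only.
# # load_example_data() is a placeholder name.
# df = load_example_data()
# df.head()
```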
Wow, thanks so much @mwouts, I really appreciate your help and thanks for everything. Keep up the good work!
You're welcome! I'm glad this helps. By the way, you mention that you are also using papermill?
@mwouts If you mean that I could run papermill on these notebooks directly, that would be great. Use case: after developing scripts interactively in Jupyter Notebook, I'd most likely want to run multiple instances of them on HPC in parallel. To do so, I would currently have to write a lot of boilerplate by hand. I didn't think this was possible and was just being wishful. Is this theoretically possible? I can imagine it's a lot of work and a bit unreasonable as a feature request for jupytext.
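For reference, a rough sketch of that use case through papermill's Python API (the notebook path, output layout, and parameter name are made up; each call could just as well be submitted as a separate HPC job):

```python
import papermill as pm

# Execute several parameterized copies of the same notebook, one per sample.
for sample_id in ["A", "B", "C"]:
    pm.execute_notebook(
        "analysis.ipynb",                    # input notebook
        f"runs/analysis_{sample_id}.ipynb",  # executed copy, outputs included
        parameters={"sample_id": sample_id},
    )
```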
Also, I wonder how the text notebook would deal with the data stored (if any) by scrapbook, a package usually used with papermill to store data in a notebook and later read it back out.
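For anyone unfamiliar with it, scrapbook roughly works like this (a minimal sketch; the file name reuses the made-up path from the example above):

```python
import scrapbook as sb

# Inside the executed notebook: persist a value in the notebook's outputs.
sb.glue("accuracy", 0.93)

# Later, from another process: read the value back from the executed .ipynb.
nb = sb.read_notebook("runs/analysis_A.ipynb")
print(nb.scraps["accuracy"].data)
```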
On second thought, this might be easier than I imagined. Basically, it would be converting the script to a notebook with jupytext first, and then running papermill on the resulting `.ipynb`.
Thanks @hoangthienan95 for sharing your use case! Very interesting. I am sure we can do something about this... I'll keep you posted!
@mwouts I also opened an issue at the papermill repo, since part of this would need changes on their side as well.
Let me know what you think!
Agreed! We won't implement anything big in Jupytext. At most we would use papermill and/or nbconvert internally to execute the notebook. And the minimum would be to have some documentation on this, plus a few tests to make sure that what we recommend does work...
Don't you think you could simply pipe the notebook? Jupytext, nbconvert, and I expect Python as well, can take notebooks on stdin/stdout. Maybe we could ask papermill to do that too? I like piping because it removes the requirement to name the notebook, especially when we have varying parameters.
Oh, that should be possible already. Can you give this a try:
`jupytext notebook.ipynb --to py -o - --update-metadata '{"jupytext":{"executable":"/usr/bin/env python"}}'`
Hi there,
Thank you for the great tool. I'm new to it but loving it so far. I was just wondering where I should go to change how jupytext identifies the cells to comment out when I want to import a notebook as a module.
I have many cells that I want to keep in `.ipynb` but commented out in `.py`, as demos/working examples of how the module works. However, to my knowledge, there is no extension for JupyterLab to edit the metadata of multiple cells at once (only cell tags), and it's hard to select all the cells that share the same metadata to get an overall view of what will be commented out. The suggested extension, `freeze`, doesn't have an equivalent in JupyterLab.
If this is the case, could you point me to where I need to change jupytext so that it detects a cell tag instead of the "active" keyword in the metadata? I already have to tag cells for papermill, and having all of this in one metadata category (`tags`) would be extremely helpful. Are there any potential problems with doing this?
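For context, here is a rough sketch (my own illustration, not taken from the jupytext docs) of the two ways such a demo cell could be marked in the notebook JSON, written out as Python dicts; `demo_of_my_function` is a placeholder name:

```python
# Current approach: an "active" entry in the cell metadata,
# awkward to edit for many cells at once in JupyterLab.
cell_with_metadata = {
    "cell_type": "code",
    "source": ["demo_of_my_function()\n"],
    "metadata": {"active": "ipynb"},
}

# Requested approach: a cell tag, editable from the cell-tags UI and
# living alongside papermill tags such as "parameters".
cell_with_tag = {
    "cell_type": "code",
    "source": ["demo_of_my_function()\n"],
    "metadata": {"tags": ["active-ipynb"]},
}
```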