Description
I have a pre-trained resnet-50 network and I would like to experiment with different amounts of compression via this discrete -> continuous -> resample process.
What I am noticing is that constructing the integral wrapper takes quite some time. From there, it appears that resizing the model groups can only be done once: any subsequent calls to resize the groups fail silently and have no effect on the model exported via get_unparameterized_model().
What this means is that each time I wish to experiment with a different amount of compression, I must reconstruct the IntegralWrapper, which is very time consuming. I've tried to pickle the wrapped model and also to deepcopy it so that I can avoid this cost, but neither approach works due to recursion-limit issues, and increasing the recursion limit simply crashes the kernel.
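For reference, here is a minimal sketch of the workflow I am describing. The import path, the IntegralWrapper constructor arguments, and the group/resize attribute names are illustrative assumptions based on how I'm using the library, not a verified API:

```python
# Minimal sketch of the discrete -> continuous -> resample experiment loop.
# Import path and all signatures below are assumptions, not verified API.
import copy
import pickle

from torchvision.models import resnet50
from torch_integral import IntegralWrapper  # assumed import path

model = resnet50(weights="IMAGENET1K_V1")

# Slow step: converting/rearranging/permuting the discrete weights into
# the continuous (integral) representation.
wrapper = IntegralWrapper(init_from_discrete=True)          # assumed signature
int_model = wrapper(model, example_input=(1, 3, 224, 224))  # assumed signature

# Try several compression levels; only the first resize seems to take effect,
# later ones are silent no-ops on the exported model.
for new_size in (192, 128, 64):        # hypothetical target sizes per group
    for group in int_model.groups:     # assumed attribute
        group.resize(new_size)         # assumed method
    compressed = int_model.get_unparameterized_model()
    # ... evaluate `compressed` here ...

# Attempted workarounds, both of which hit RecursionError:
# copy.deepcopy(int_model)
# pickle.dumps(int_model)
```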
So this issue essentially boils down to three questions:
- How can resize() be made to work if called multiple times without reconstructing the wrapped model each time?
- Can the wrapped model be pickled or saved such that we can skip the time-consuming process of converting/rearranging/permuting if it has already been done once?
- Is there a different or better way to experiment with different compression amounts without reconstructing the wrapped model each time?