Conversation

> 8. This concludes the installation! Continue reading the [Running SHARPy](#running-sharpy) section.

> ### Step 2 (PyPI route): Obtain SHARPy from PyPI (experimental!)
We may want to add some disclaimers here about packages that need to be installed beforehand, as well as it being prebuilt. @kccwing?
I realise this PR has been sitting for a while... do we want to make any changes on the PyPI side of things, @kccwing, or are we good to merge?
I think @wong-hl is working on the matrix build, and hence the tvtk -> vtk switch to make it happen? Sorry I haven't been keeping up; glad to discuss the next steps.
The matrix build will need a bit of work, as the packaging process needs to be restructured so that each OS and CPU combination is actually saved as a different wheel. They currently end up in the same wheel and overwrite each other.
The tvtk -> vtk switch is a separate issue. I think @ben-l-p is also working on it? I've paused my work on it because he mentioned he was doing some work on that front, and the test suite doesn't catch the errors.
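On the wheel-collision point above: a pure-Python wheel is tagged `py3-none-any` on every platform, so wheels built on different OS/CPU runners share one filename and overwrite each other, while a wheel with native code carries a platform tag instead. A minimal stdlib-only sketch of where that tag comes from (PEP 425 tag format; the helper name is hypothetical):

```python
import sysconfig


def platform_wheel_tag():
    # PEP 425 platform tag, e.g. "linux_x86_64" or "macosx_11_0_arm64".
    # A pure-Python wheel uses "any" here instead, which is why builds
    # from every OS/CPU runner produce the same py3-none-any filename
    # and collide with one another.
    return sysconfig.get_platform().replace("-", "_").replace(".", "_")


print(platform_wheel_tag())
```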
I think we should reserve the build matrix and VTK work for future PRs, as I have not had much time to look into implementing calls to VTK. In that case, I believe this should be good to merge.
The changes here don't affect the CD, so assuming it uses what is on master, it will publish to PyPI when there is a new release.
Ah, makes sense. There aren't many new features since the last release, more just fixes, but we could do a new release soon and then all will be well.
Happy new year both! I'll get back to properly creating the distributions and deploying them, thanks!
Happy new year Kelvin!! I had a look into creating the distributions yesterday, as I wanted to play around with cibuildwheel on a code base that has native code. I managed to get it working, but it thinks that SHARPy is a pure-Python wheel... So I dug a little deeper and found that when building the sdist (source distribution), it was building the native components due to this line in the setup.py, where it invokes the run() method to build the native libraries. This meant that the source-only distribution and the platform-dependent distributions were identical, thereby making SHARPy appear to be a pure-Python wheel. The fix is to somehow specify in the packaging/build process that there are native components and where they are found.
From a quick dive into the setuptools docs to see how setup.py could be modified, and a skim of the page on building extension modules, I found that setuptools expects the native code to be an extension module (e.g. sharpy.uvlm.solver for a binary that it will load). Once I saw that SHARPy invokes the shared object directly through ctypes for the FFI, I gave up and called it a day. There may be a simple fix/trick to it, but it wasn't immediately obvious from the documentation.
Edit: I also looked at numpy, scikit-learn and pycontrails to see if I could nick some code from them, but had no luck.
TL;DR - I think creating platform-dependent distributions of SHARPy won't be trivial and would take more than a day (depending on how tangled the build process is, and whether there is an LLM that knows how to deal with this situation).
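For reference, one common workaround for the pure-Python misclassification is overriding `Distribution.has_ext_modules()` so setuptools tags the wheel for the platform even without declared `Extension` modules. This is only a sketch under assumptions: the `NativeBuild` hook and where it would compile SHARPy's ctypes-loaded libraries are hypothetical and untested against SHARPy's actual build.

```python
from setuptools import Distribution
from setuptools.command.build_ext import build_ext


class BinaryDistribution(Distribution):
    """Tell setuptools the package contains native code, so the built
    wheel gets a platform tag instead of py3-none-any."""

    def has_ext_modules(self):
        return True


class NativeBuild(build_ext):
    """Hypothetical hook: build the ctypes-loaded shared libraries at
    wheel-build time (e.g. by invoking the existing native build step)
    rather than at sdist time, so the sdist stays source-only."""

    def run(self):
        # Compile and copy the resulting .so/.dylib files into
        # self.build_lib before delegating to the default behaviour.
        super().run()


# In setup.py, these would be passed to setup(), e.g.:
# setup(..., distclass=BinaryDistribution, cmdclass={"build_ext": NativeBuild})
```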
I don't think it's worth spending too much time on this; the pip install from a git clone is not that troublesome. The only actual need for this is for when I go to Airbus, but I can just repeat the way that Kelvin got it working, even if it isn't the most elegant. If others are happy with this, it might just be a good idea to remove mention of the PyPI route from the docs?
Some tidying up of the installation for v2.4