
scipy.fft interface #269

Merged: 9 commits merged on Jul 26, 2019

Conversation

@peterbell10 (Contributor)

Closes #268

Since the scipy.fft interface matches numpy.fft very closely, this is simple enough to implement. The exception is the n-dimensional hffts, which would require more work than the 1d hfft in the numpy interface.
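
For reference, a minimal sketch of the intended drop-in usage (assuming only function names shared with scipy.fft, minus the n-dimensional hffts noted above):

import numpy as np
from pyfftw.interfaces import scipy_fft

x = np.random.rand(256)

X = scipy_fft.fft(x)        # same call signature as scipy.fft.fft
x_back = scipy_fft.ifft(X)  # likewise for the inverse transform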

@grlee77 (Contributor) commented Jul 17, 2019

The CI failure here is unrelated (it seems the availability of some conda packages for Python 2.7 has changed).

It may be time to just drop Python 2.7 support on master rather than trying to fix this, but I will open a separate issue for that.

@grlee77 (Contributor) commented Jul 17, 2019

Once uarray backend support has been added here, please also revise the following section of the docs:

https://github.com/pyFFTW/pyFFTW/blob/master/docs/source/tutorial.rst#monkey-patching-3rd-party-libraries

@peterbell10 (Contributor, Author)

uarray support is added and works for me locally. I've updated the tutorial as well.

@hgomersall (Collaborator)

What're the uarray changes? I'm a bit out of touch on this...

@peterbell10 (Contributor, Author)

It was just merged into scipy today (scipy/scipy#10383). It means you can provide the implementation for the functions in scipy.fft without monkey-patching, for example:

import scipy.fft
scipy.fft.fft([1])  # Calls scipy's own implementation

from pyfftw.interfaces import scipy_fft
scipy.fft.set_global_backend(scipy_fft)
scipy.fft.fft([1])  # Calls into pyfftw

See the uarray docs for a description of __ua_function__. It's very similar to NumPy's new __array_function__ protocol if you've heard of that.
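
For a temporary override rather than a global one, scipy.fft also exposes set_backend as a context manager; a rough sketch, based on the test snippet quoted further down:

import numpy as np
import scipy.fft
from pyfftw.interfaces import scipy_fft

x = np.random.rand(1024)

with scipy.fft.set_backend(scipy_fft, only=True):
    y = scipy.fft.fft(x)      # dispatched to pyfftw inside the block

y_default = scipy.fft.fft(x)  # scipy's own implementation again outside the block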

@peterbell10 force-pushed the scipy_fft branch 7 times, most recently from 6a3b487 to 8230ba4 on July 17, 2019 at 20:53
@peterbell10 marked this pull request as ready for review on July 18, 2019 at 10:46
@peterbell10 (Contributor, Author)

======================================================================
ERROR: test_scipy_backend (test.test_pyfftw_scipy_fft.InterfacesScipyFFTTestSimple)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/build/pyFFTW/pyFFTW/test/test_pyfftw_scipy_fft.py", line 99, in test_scipy_backend
    with scipy.fft.set_backend(scipy_fft, only=True):
AttributeError: module 'scipy.fft' has no attribute 'set_backend'

It looks like the scipy dev wheels are built weekly, so this test will continue to fail until sometime next week. It passes for me locally, though.

@hgomersall (Collaborator)

Does this break without the latest scipy? Is it only this set_backend bit?

@peterbell10 (Contributor, Author)

The only issue is that the symbol scipy.fft.set_backend doesn't exist and it's used in the test. I could just skip that test and everything should be fine.
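
A minimal sketch of that kind of skip, assuming a unittest-style test class (the actual guard used in the PR may differ):

import unittest
import scipy

try:
    import scipy.fft
    has_set_backend = hasattr(scipy.fft, 'set_backend')
except ImportError:
    has_set_backend = False

@unittest.skipIf(not has_set_backend,
                 'scipy.fft.set_backend not available in scipy %s' % scipy.__version__)
class InterfacesScipyFFTTestSimple(unittest.TestCase):
    def test_scipy_backend(self):
        ...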

@peterbell10 (Contributor, Author)

And of course if scipy.fft doesn't exist then everything is excluded and skipped already.

@hgomersall (Collaborator)

I think I'd prefer the import code to inspect the scipy version to make it explicit for future reading (scipy.__version__). You could use the same check with a SkipIf decorator on the test that's failing; then it should work. Do the dev versions bump the version at the beginning of a development cycle?

@peterbell10 (Contributor, Author)

> I think I'd prefer the import code to inspect the scipy version to make it explicit for future reading (scipy.__version__).

Done

> You could use the same check with a SkipIf decorator on the test that's failing; then it should work.

Unfortunately the weekly builds are only identified by part of a commit hash, so ordered comparison isn't really possible.

> Do the dev versions bump the version at the beginning of a development cycle?

Yes, it's currently on 1.4.0.dev0, while the released version is 1.3.0.

@hgomersall (Collaborator)

> Unfortunately the weekly builds are only identified by part of a commit hash, so ordered comparison isn't really possible.

Doesn't the version convey the information? I thought the API change was 1.3 -> 1.4?

@peterbell10 (Contributor, Author)

Yes, for release versions that will be sufficient, but 1.4 hasn't been released yet.

@hgomersall (Collaborator) commented Jul 19, 2019

Do the comparisons not work on dev releases? I was under the impression that semantic versioning was meant to allow precisely this sort of comparison...

@peterbell10 (Contributor, Author)

The dev versions are like 1.4.0.dev0+<commit hash>, and hashes aren't ordered, so comparing them with >= doesn't make sense. You can tell that it is a 1.4.0 dev release, but you can't tell how far into development it is.

@hgomersall (Collaborator)

Ah, I see. Anyway, it looks fine to me. The problem at the moment, then, is that it breaks against dev releases but should work fine on non-dev versions?

@peterbell10 (Contributor, Author)

Yes, it's only an issue with 1.4.0dev versions from before the uarray features were added. Notice that on Travis everything else is passing.

@hgomersall (Collaborator)

Ok, in that case let's wait for the weekly release and then merge.

@hgomersall (Collaborator)

Nice work by the way :)

@larsoner

@peterbell10 if it makes life easier for this backend work, we can probably increment the dev number to 1.4.0dev1.

@peterbell10 (Contributor, Author)

Unfortunately LooseVersion isn't smart enough to realise that 1.4.0 > 1.4.0.dev0, which would complicate the checking further. I think just waiting until Monday is good enough.
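
A quick illustration of that quirk with distutils' LooseVersion, as it behaved at the time:

from distutils.version import LooseVersion

# The trailing 'dev0' is parsed as extra version components, so the dev
# release sorts *after* the final release instead of before it.
print(LooseVersion('1.4.0') > LooseVersion('1.4.0.dev0'))    # False
print(LooseVersion('1.4.0.dev0') >= LooseVersion('1.4.0'))   # True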

@peterbell10 (Contributor, Author)

The new wheels are out and the tests are now passing.

@grlee77 (Contributor) left a review

This looks good to me, with only minor comments to address.

Time with scipy.fftpack: 0.598 seconds
Time with monkey patched scipy_fftpack: 0.251 seconds
Time with scipy.fft default backend: 0.598 seconds
Time with pyfftw backend installed: 0.251 seconds

@grlee77 (Contributor)
Can you update the numbers here? I am curious whether the ratio is still similar to that seen here.

@peterbell10 (Contributor, Author)

The gap has closed slightly, but I think the main advantage pyfftw has now is multithreading.
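
For anyone wanting to reproduce the comparison, here is a rough sketch of the kind of timing quoted above (exact numbers depend on the machine, transform size, and thread count):

import timeit
import numpy as np
import scipy.fft
from pyfftw.interfaces import cache, scipy_fft

cache.enable()  # keep FFTW plans between calls so planning doesn't dominate

a = np.random.rand(512, 512)

# Time scipy's default backend first, then install the pyfftw backend and repeat.
t_default = timeit.timeit(lambda: scipy.fft.fft2(a), number=100)

scipy.fft.set_global_backend(scipy_fft)
t_pyfftw = timeit.timeit(lambda: scipy.fft.fft2(a), number=100)

print('Time with scipy.fft default backend: %.3f seconds' % t_default)
print('Time with pyfftw backend installed: %.3f seconds' % t_pyfftw)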

@grlee77 (Contributor) commented Jul 25, 2019

@hgomersall, if these changes look good to you, I think we are done here.

Successfully merging this pull request may close: Add a scipy.fft interface (#268)

4 participants