ENH: Add cosine-basis high-pass-filter to CompCor #2107


Merged: 21 commits into master, Jul 12, 2017

Conversation

@effigies (Member) commented Jun 29, 2017

Changes proposed in this pull request

  • Adds a high-pass filter using the discrete cosine transform
  • Enabled with high_pass_filter
  • TR can be specified as an input or derived from the NIfTI header
  • Cosine regressors can be saved

Todo:

@codecov-io commented Jun 30, 2017

Codecov Report

Merging #2107 into master will increase coverage by 23.52%.
The diff coverage is 29.76%.


@@             Coverage Diff             @@
##           master    #2107       +/-   ##
===========================================
+ Coverage   48.56%   72.09%   +23.52%     
===========================================
  Files         116     1144     +1028     
  Lines       23598    57675    +34077     
  Branches        0     8265     +8265     
===========================================
+ Hits        11461    41582    +30121     
- Misses      12137    14786     +2649     
- Partials        0     1307     +1307
Flag Coverage Δ
#smoketests 72.09% <29.76%> (+23.52%) ⬆️
#unittests 69.72% <29.76%> (?)
Impacted Files Coverage Δ
nipype/algorithms/tests/test_auto_ACompCor.py 85.71% <ø> (ø)
nipype/workflows/rsfmri/fsl/resting.py 85.48% <ø> (+70.96%) ⬆️
nipype/algorithms/tests/test_compcor.py 100% <ø> (ø)
nipype/algorithms/tests/test_auto_TCompCor.py 85.71% <ø> (ø)
nipype/algorithms/confounds.py 69.09% <29.76%> (+37.4%) ⬆️
...ipype/interfaces/spm/tests/test_auto_NewSegment.py 85.71% <0%> (ø)
...pe/interfaces/freesurfer/tests/test_auto_Smooth.py 85.71% <0%> (ø)
nipype/interfaces/slicer/registration/brainsfit.py 100% <0%> (ø)
...es/freesurfer/tests/test_auto_MNIBiasCorrection.py 85.71% <0%> (ø)
.../interfaces/utility/tests/test_auto_AssertEqual.py 100% <0%> (ø)
... and 1088 more

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update af2b7aa...e03cb13.

@effigies effigies changed the title [WIP] ENH: Add cosine-basis high-pass-filter to CompCor ENH: Add cosine-basis high-pass-filter to CompCor Jul 3, 2017
@effigies (Member, Author) commented Jul 3, 2017

This is ready to go.

@satra (Member) commented Jul 3, 2017

@effigies - the code looks reasonable.

Why are we adding high-pass filtering to CompCor? I can't recollect this being in the Behzadi paper. Further, the polynomial regression achieves a similar effect; it's not a cosine basis, but the functionality is the same.

data = data.reshape((-1, timepoints))

frametimes = timestep * np.arange(timepoints)
X = _full_rank(_cosine_drift(128, frametimes))[0]
Review comment (Member):
This number appears to come from the SPM default high-pass filter. If this is indeed necessary, shouldn't the high-pass filter cutoff be a variable?

Reply (Member, Author):
Yeah, I'll add that.
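For context, here is a minimal sketch of the DCT basis that a helper like `_cosine_drift` builds, with the cutoff exposed as a parameter as requested above. This is an illustrative reimplementation under the assumption of evenly spaced frame times, not the nipy code verbatim:

```python
import numpy as np

def cosine_drift(period_cut, frametimes):
    """Unit-norm DCT-II regressors spanning drifts slower than
    1/period_cut Hz, plus a constant column.

    period_cut : cutoff period in seconds (128 is the SPM default)
    frametimes : acquisition time of each volume, in seconds
    """
    n = len(frametimes)
    dt = frametimes[1] - frametimes[0]          # repetition time
    # number of regressors needed to cover frequencies below the cutoff
    order = int(np.floor(2 * n * dt / period_cut))
    basis = np.zeros((n, order))
    t = np.arange(n)
    norm = np.sqrt(2.0 / n)                     # makes each column unit-norm
    for k in range(1, order):
        basis[:, k - 1] = norm * np.cos((np.pi / n) * (t + 0.5) * k)
    basis[:, -1] = 1.0                          # constant (intercept) term
    return basis
```

High-pass filtering then amounts to regressing these columns out of each voxel's time series, i.e. projecting the data onto the orthogonal complement of the basis.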

@effigies (Member, Author) commented Jul 3, 2017

From Behzadi et al.

These low-frequency components can appear in the CompCor spectra because we remove only constant and linear trend terms prior to the principal component analysis. Note that, as stated in the Methods section, the use of the discrete cosine transform (DCT) low-frequency nuisance terms is reserved for the statistical assessments performed with the GLM. As a result, the space spanned by the principal components may partially overlap with the space spanned by the discrete cosine transform nuisance terms.

I'm definitely not an expert on the theory here, and certainly the Legendre basis set seems equally valid. One distinction I would draw between these approaches is in selecting the degree; perhaps there's a way to select the polynomial order based on frequency cutoff, but it's a very natural approach with a DCT basis. At present, the default order is 1, and I think I've seen "order 2 polynomial detrending" in papers to account for nonlinear drift.

This is my first foray into CompCor territory, so I don't have strong opinions here. Arguments are welcome.

Also, I did write this to be "side-by-side" compatible with polynomial detrending, but if it makes more sense to be an xor situation, let me know.
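To make the comparison with the DCT approach concrete, here is a minimal sketch of what Legendre-polynomial detrending does; an illustration of the idea behind the existing polynomial option, not the exact nipype implementation:

```python
import numpy as np

def legendre_detrend(data, degree=1):
    """Regress a Legendre polynomial trend of the given degree out of
    each row of a (voxels x timepoints) array; return the residuals."""
    n_timepoints = data.shape[-1]
    # design matrix of Legendre polynomials evaluated on [-1, 1]
    x = np.linspace(-1, 1, n_timepoints)
    X = np.polynomial.legendre.legvander(x, degree)  # (timepoints, degree+1)
    beta, *_ = np.linalg.lstsq(X, data.T, rcond=None)
    return data - (X @ beta).T
```

The practical difference is the knob the user turns: here a polynomial degree, with the DCT a frequency cutoff, so the effective cutoff of polynomial detrending depends on scan length.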

@@ -945,3 +1022,69 @@ def _compute_tSTD(M, x, axis=0):
stdM[stdM == 0] = x
stdM[np.isnan(stdM)] = x
return stdM


# _cosine_drift and _full_rank copied from nipy/modalities/fmri/design_matrix
Review comment (Member):

Is there a big disadvantage to depending on nipy here instead of copying the code?

Reply (Member, Author):
Originally I did it simply to avoid adding a dependency. I ended up augmenting it to set a minimum order of 1.

@chrisgorgo (Member) commented:

Hi @satra - high-pass filtering has been requested as a more explicit form of detrending by an FMRIPREP user. The idea is to capture nontrivial variation with the CompCor components.

Providing a high-pass cutoff rather than a polynomial order seems to be a more interpretable way to do this kind of detrending. The functionality is optional and should not interfere with traditional uses.

@satra (Member) commented Jul 5, 2017

@effigies - it would be good to make regress_poly and high_pass_filter xor, since they are supposed to achieve similar end goals. If we are going to do a high-pass filter, does it make sense to simply do a band-pass filter? I don't know why it should be high-pass in particular.

@chrisfilo - on the user side, the goal of CompCor is to remove nuisance projectors, which to some extent overlaps with the notion of temporal filtering. The reason a basic polynomial regressor was used was to clearly take out linear/quadratic trends as per the paper. By allowing more general filtering, users will have to be careful about what they really mean by nuisance and what impact such filtering has on subsequent steps. Finally, by bringing more general filtering into the picture, everything about filter design comes into play, something often ignored in the algorithms that implement this.

@chrisgorgo (Member) commented Jul 5, 2017 via email

num_components = traits.Int(6, usedefault=True) # 6 for BOLD, 4 for ASL
use_regress_poly = traits.Bool(True, usedefault=True,
use_regress_poly = traits.Bool(True, usedefault=True, xor=['high_pass_filter'],
Review comment (Member, Author):
After adding the xor entries, in order to use high_pass_filter, we need to turn off usedefault here, which means we could be breaking existing scripts in one way or another.

Given that, maybe the better way to do this is to switch to an Enum, such as

pre_filter=traits.Enum('polynomial', 'high_pass', False, usedefault=True,
                       desc='Detrend time series prior to component extraction')

WDYT?
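A rough sketch of how such an Enum could be consumed to build the detrending design matrix; all names and defaults here are illustrative, not the merged code:

```python
import numpy as np

def build_filter_basis(pre_filter, timepoints, degree=1,
                       period_cut=128.0, tr=2.0):
    """Select a detrending basis from the pre_filter setting."""
    if pre_filter == 'polynomial':
        # Legendre polynomial columns on [-1, 1], up to the given degree
        x = np.linspace(-1, 1, timepoints)
        return np.polynomial.legendre.legvander(x, degree)
    if pre_filter == 'high_pass':
        # DCT cosine columns spanning drifts slower than 1/period_cut Hz
        order = max(int(np.floor(2 * timepoints * tr / period_cut)), 1)
        n = np.arange(timepoints)
        k = np.arange(1, order + 1)
        return np.cos((np.pi / timepoints) * np.outer(n + 0.5, k))
    return None  # pre_filter is False: no detrending
```

An Enum keeps the three behaviours mutually exclusive by construction, which sidesteps the xor/usedefault interaction described above.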

@chrisgorgo (Member) commented Jul 5, 2017 via email

@effigies force-pushed the enh/hpf_compcor branch 2 times, most recently from e0d2ad3 to 56ff3c2, on July 5, 2017 at 20:29
@effigies (Member, Author) commented Jul 5, 2017

Okay. I've moved to an Enum-based filter selection, and deprecated use_regress_poly. It is no longer set True by default, but the pre_filter=='polynomial' default is equivalent. If use_regress_poly==True, pre_filter is overridden. I'm counting on nipype's deprecation warnings to let people know to move off of it.

I'm also making the filtering regressors generally available. There's no cost to keeping them around if people want them.
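The override described above might look roughly like this; an illustrative sketch of the back-compatibility logic, not the merged code:

```python
import warnings

def resolve_pre_filter(pre_filter='polynomial', use_regress_poly=None):
    """An explicitly set, deprecated use_regress_poly=True wins over
    pre_filter and emits a deprecation warning."""
    if use_regress_poly:
        warnings.warn("use_regress_poly is deprecated as of 0.15.0; "
                      "use pre_filter instead", DeprecationWarning)
        return 'polynomial'
    return pre_filter
```

Scripts that never touched use_regress_poly get the same default behaviour as before, since pre_filter defaults to 'polynomial'.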

desc='Detrend time series prior to component '
'extraction')
use_regress_poly = traits.Bool(True,
deprecated='0.15.0', new_name='pre_filter',
Review comment (Member, Author):
@satra I set this to deprecate at 0.15.0, two minor versions in the future. Should that be 3.0.0, instead? It's pretty unclear what a deprecation schedule should look like.

@effigies (Member, Author) commented:
@satra I think this is ready to go, but want your input on the deprecation.

@satra (Member) commented Jul 12, 2017

@effigies - looks fine to me. At this point it's possible it will get deprecated with the planned 1.0 release in the fall. I think that's OK.

@effigies effigies merged commit d843794 into nipy:master Jul 12, 2017
@effigies effigies deleted the enh/hpf_compcor branch July 12, 2017 12:15
effigies added a commit to effigies/niworkflows that referenced this pull request Jul 12, 2017
effigies added a commit to nipreps/niworkflows that referenced this pull request Jul 12, 2017
@satra satra added this to the 0.14.0 milestone Oct 20, 2017
4 participants