Commit f66a5a6
Merge pull request #1981 from oesteban/fix/1933
[FIX] Ensure build fails in Circle when tests fail
2 parents c26b3de + 8d530d3

File tree: 6 files changed, +123 −96 lines

.circle/tests.sh

Lines changed: 9 additions & 1 deletion
@@ -19,25 +19,33 @@ case ${CIRCLE_NODE_INDEX} in
 0)
     docker run --rm=false -it -e FSL_COURSE_DATA="/data/examples/nipype-fsl_course_data" -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_pytests.sh && \
     docker run --rm=false -it -e FSL_COURSE_DATA="/data/examples/nipype-fsl_course_data" -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py27 /usr/bin/run_pytests.sh && \
-    docker run --rm=false -it -v $WORKDIR:/work -w /src/nipype/doc nipype/nipype:py36 /usr/bin/run_builddocs.sh && \
+    docker run --rm=false -it -v $WORKDIR:/work -w /src/nipype/doc --entrypoint=/usr/bin/run_builddocs.sh nipype/nipype:py36 /usr/bin/run_builddocs.sh && \
     docker run --rm=false -it -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh test_spm Linear /data/examples/ workflow3d && \
     docker run --rm=false -it -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh test_spm Linear /data/examples/ workflow4d
+    exitcode=$?
     ;;
 1)
     docker run --rm=false -it -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh fmri_spm_dartel Linear /data/examples/ level1 && \
     docker run --rm=false -it -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh fmri_spm_dartel Linear /data/examples/ l2pipeline
+    exitcode=$?
     ;;
 2)
     docker run --rm=false -it -e NIPYPE_NUMBER_OF_CPUS=4 -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py27 /usr/bin/run_examples.sh fmri_spm_nested MultiProc /data/examples/ level1 && \
     docker run --rm=false -it -e NIPYPE_NUMBER_OF_CPUS=4 -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh fmri_spm_nested MultiProc /data/examples/ l2pipeline
+    exitcode=$?
     ;;
 3)
     docker run --rm=false -it -e NIPYPE_NUMBER_OF_CPUS=4 -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh fmri_spm_nested MultiProc /data/examples/ level1 && \
     docker run --rm=false -it -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh fmri_fsl_feeds Linear /data/examples/ l1pipeline && \
     docker run --rm=false -it -v $HOME/examples:/data/examples:ro -v $WORKDIR:/work -w /work nipype/nipype:py36 /usr/bin/run_examples.sh fmri_fsl_reuse Linear /data/examples/ level1_workflow
+    exitcode=$?
     ;;
 esac

 cp ${WORKDIR}/tests/*.xml ${CIRCLE_TEST_REPORTS}/tests/
+
+# Exit with error if any of the tests failed
+if [ "$exitcode" != "0" ]; then exit 1; fi
 codecov -f "coverage*.xml" -s "${WORKDIR}/tests/" -R "${HOME}/nipype/" -F unittests -e CIRCLE_NODE_INDEX
 codecov -f "smoketest*.xml" -s "${WORKDIR}/tests/" -R "${HOME}/nipype/" -F smoketests -e CIRCLE_NODE_INDEX
+
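The pattern this fix introduces — capture `$?` immediately after each node's `docker run ... && docker run ...` chain, then fail the script explicitly after the bookkeeping steps — can be sketched as follows. This is a minimal stand-in, not the real CI script: `false` simulates a failing test container and the `echo` lines simulate the `cp`/`codecov` calls.

```shell
#!/usr/bin/env bash

run_node() {
    # Stand-in for one node's chained docker runs; `false` makes the
    # chain fail the way a failing test container would, and the rest
    # of the chain is short-circuited.
    false && echo "next command in the chain"
    exitcode=$?    # capture the chain's status before anything else runs

    # Bookkeeping (cp/codecov in the real script) still runs and exits
    # with 0, which is exactly what used to hide failures from CircleCI:
    # without the explicit check, the script would exit with the status
    # of its last command.
    echo "copying test reports"

    # Propagate the captured status so CI marks the build as failed.
    if [ "$exitcode" != "0" ]; then return 1; fi
}

run_node
echo "node finished with status $?"
```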

CHANGES

Lines changed: 1 addition & 0 deletions
@@ -1,6 +1,7 @@
 Upcoming Release
 =====================

+* FIX: Ensure build fails in Circle when tests fail (https://github.com/nipy/nipype/pull/1981)
 * ENH: Add interface to antsAffineInitializer (https://github.com/nipy/nipype/pull/1980)
 * ENH: AFNI motion parameter support for FrameWiseDisplacement (https://github.com/nipy/nipype/pull/1840)
 * ENH: Add ANTs KellyKapowski interface (https://github.com/nipy/nipype/pull/1845)

docker/files/run_pytests.sh

Lines changed: 2 additions & 0 deletions
@@ -40,4 +40,6 @@ fi
 # Collect crashfiles
 find ${WORKDIR} -name "crash-*" -maxdepth 1 -exec mv {} ${WORKDIR}/crashfiles/ \;

+echo "Unit tests finished with exit code ${exit_code}"
 exit ${exit_code}
+

nipype/algorithms/confounds.py

Lines changed: 88 additions & 81 deletions
@@ -330,7 +330,7 @@ class CompCorOutputSpec(TraitedSpec):
         desc='text file containing the noise components')

 class CompCor(BaseInterface):
-    '''
+    """
     Interface with core CompCor computation, used in aCompCor and tCompCor

     Example
@@ -342,7 +342,8 @@ class CompCor(BaseInterface):
     >>> ccinterface.inputs.num_components = 1
     >>> ccinterface.inputs.use_regress_poly = True
     >>> ccinterface.inputs.regress_poly_degree = 2
-    '''
+
+    """
     input_spec = CompCorInputSpec
     output_spec = CompCorOutputSpec
     references_ = [{'entry': BibTeX("@article{compcor_2007,"
@@ -465,8 +466,11 @@ def _make_headers(self, num_col):


 class ACompCor(CompCor):
-    ''' Anatomical compcor; for input/output, see CompCor.
-    If the mask provided is an anatomical mask, CompCor == ACompCor '''
+    """
+    Anatomical compcor: for inputs and outputs, see CompCor.
+    When the mask provided is an anatomical mask, then CompCor
+    is equivalent to ACompCor.
+    """

     def __init__(self, *args, **kwargs):
         ''' exactly the same as compcor except the header '''
@@ -492,7 +496,7 @@ class TCompCorOutputSpec(CompCorInputSpec):
         desc="voxels excedding the variance threshold"))

 class TCompCor(CompCor):
-    '''
+    """
     Interface for tCompCor. Computes a ROI mask based on variance of voxels.

     Example
@@ -505,7 +509,8 @@ class TCompCor(CompCor):
     >>> ccinterface.inputs.use_regress_poly = True
     >>> ccinterface.inputs.regress_poly_degree = 2
     >>> ccinterface.inputs.percentile_threshold = .03
-    '''
+
+    """

     input_spec = TCompCorInputSpec
     output_spec = TCompCorOutputSpec
@@ -634,7 +639,8 @@ class TSNROutputSpec(TraitedSpec):


 class TSNR(BaseInterface):
-    """Computes the time-course SNR for a time series
+    """
+    Computes the time-course SNR for a time series

     Typically you want to run this on a realigned time-series.
@@ -719,80 +725,6 @@ def _run_interface(self, runtime):
     def _list_outputs(self):
         return self._results

-def is_outlier(points, thresh=3.5):
-    """
-    Returns a boolean array with True if points are outliers and False
-    otherwise.
-
-    Parameters:
-    -----------
-    points : An numobservations by numdimensions array of observations
-    thresh : The modified z-score to use as a threshold. Observations with
-        a modified z-score (based on the median absolute deviation) greater
-        than this value will be classified as outliers.
-
-    Returns:
-    --------
-    mask : A numobservations-length boolean array.
-
-    References:
-    ----------
-    Boris Iglewicz and David Hoaglin (1993), "Volume 16: How to Detect and
-    Handle Outliers", The ASQC Basic References in Quality Control:
-    Statistical Techniques, Edward F. Mykytka, Ph.D., Editor.
-    """
-    if len(points.shape) == 1:
-        points = points[:, None]
-    median = np.median(points, axis=0)
-    diff = np.sum((points - median) ** 2, axis=-1)
-    diff = np.sqrt(diff)
-    med_abs_deviation = np.median(diff)
-
-    modified_z_score = 0.6745 * diff / med_abs_deviation
-
-    timepoints_to_discard = 0
-    for i in range(len(modified_z_score)):
-        if modified_z_score[i] <= thresh:
-            break
-        else:
-            timepoints_to_discard += 1
-
-    return timepoints_to_discard
-
-
-def regress_poly(degree, data, remove_mean=True, axis=-1):
-    ''' returns data with degree polynomial regressed out.
-    Be default it is calculated along the last axis (usu. time).
-    If remove_mean is True (default), the data is demeaned (i.e. degree 0).
-    If remove_mean is false, the data is not.
-    '''
-    IFLOG.debug('Performing polynomial regression on data of shape ' + str(data.shape))
-
-    datashape = data.shape
-    timepoints = datashape[axis]
-
-    # Rearrange all voxel-wise time-series in rows
-    data = data.reshape((-1, timepoints))
-
-    # Generate design matrix
-    X = np.ones((timepoints, 1))  # quick way to calc degree 0
-    for i in range(degree):
-        polynomial_func = Legendre.basis(i + 1)
-        value_array = np.linspace(-1, 1, timepoints)
-        X = np.hstack((X, polynomial_func(value_array)[:, np.newaxis]))
-
-    # Calculate coefficients
-    betas = np.linalg.pinv(X).dot(data.T)
-
-    # Estimation
-    if remove_mean:
-        datahat = X.dot(betas).T
-    else:  # disregard the first layer of X, which is degree 0
-        datahat = X[:, 1:].dot(betas[1:, ...]).T
-    regressed_data = data - datahat
-
-    # Back to original shape
-    return regressed_data.reshape(datashape)

 def compute_dvars(in_file, in_mask, remove_zerovariance=False,
                   intensity_normalization=1000):
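One detail of the `is_outlier` helper being moved here is easy to miss: despite its docstring, it does not return a boolean mask — it returns the number of *leading* timepoints whose modified z-score exceeds the threshold, stopping at the first inlier. A standalone sketch (the function body copied from the diff, the sample data hypothetical):

```python
import numpy as np

def is_outlier(points, thresh=3.5):
    # Standalone copy of the helper from the diff (docstring dropped).
    # Note: returns a COUNT of leading outlier timepoints, not a mask.
    if len(points.shape) == 1:
        points = points[:, None]
    median = np.median(points, axis=0)
    diff = np.sqrt(np.sum((points - median) ** 2, axis=-1))
    med_abs_deviation = np.median(diff)
    modified_z_score = 0.6745 * diff / med_abs_deviation

    timepoints_to_discard = 0
    for i in range(len(modified_z_score)):
        if modified_z_score[i] <= thresh:
            break  # stops at the first non-outlier timepoint
        else:
            timepoints_to_discard += 1
    return timepoints_to_discard

# Two extreme leading values followed by a tight cluster:
series = np.array([100.0, 80.0, 1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.02])
print(is_outlier(series))  # → 2
```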
@@ -921,3 +853,78 @@ def plot_confound(tseries, figsize, name, units=None,
     ax.set_ylim(ylim)
     ax.set_yticklabels([])
     return fig
+
+
+def is_outlier(points, thresh=3.5):
+    """
+    Returns a boolean array with True if points are outliers and False
+    otherwise.
+
+    :param nparray points: an numobservations by numdimensions numpy array of observations
+    :param float thresh: the modified z-score to use as a threshold. Observations with
+        a modified z-score (based on the median absolute deviation) greater
+        than this value will be classified as outliers.
+
+    :return: A boolean mask, of size numobservations-length array.
+
+    .. note:: References
+
+        Boris Iglewicz and David Hoaglin (1993), "Volume 16: How to Detect and
+        Handle Outliers", The ASQC Basic References in Quality Control:
+        Statistical Techniques, Edward F. Mykytka, Ph.D., Editor.
+
+    """
+    if len(points.shape) == 1:
+        points = points[:, None]
+    median = np.median(points, axis=0)
+    diff = np.sum((points - median) ** 2, axis=-1)
+    diff = np.sqrt(diff)
+    med_abs_deviation = np.median(diff)
+
+    modified_z_score = 0.6745 * diff / med_abs_deviation
+
+    timepoints_to_discard = 0
+    for i in range(len(modified_z_score)):
+        if modified_z_score[i] <= thresh:
+            break
+        else:
+            timepoints_to_discard += 1
+
+    return timepoints_to_discard
+
+
+def regress_poly(degree, data, remove_mean=True, axis=-1):
+    """
+    Returns data with degree polynomial regressed out.
+
+    :param bool remove_mean: whether or not demean data (i.e. degree 0),
+    :param int axis: numpy array axes along which regression is performed
+
+    """
+    IFLOG.debug('Performing polynomial regression on data of shape ' + str(data.shape))
+
+    datashape = data.shape
+    timepoints = datashape[axis]
+
+    # Rearrange all voxel-wise time-series in rows
+    data = data.reshape((-1, timepoints))
+
+    # Generate design matrix
+    X = np.ones((timepoints, 1))  # quick way to calc degree 0
+    for i in range(degree):
+        polynomial_func = Legendre.basis(i + 1)
+        value_array = np.linspace(-1, 1, timepoints)
+        X = np.hstack((X, polynomial_func(value_array)[:, np.newaxis]))
+
+    # Calculate coefficients
+    betas = np.linalg.pinv(X).dot(data.T)
+
+    # Estimation
+    if remove_mean:
+        datahat = X.dot(betas).T
+    else:  # disregard the first layer of X, which is degree 0
+        datahat = X[:, 1:].dot(betas[1:, ...]).T
+    regressed_data = data - datahat
+
+    # Back to original shape
+    return regressed_data.reshape(datashape)
+
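The `regress_poly` helper shown in this diff fits a Legendre-polynomial design matrix by least squares and subtracts the fit. A self-contained sketch of the same logic (the `IFLOG` logging call stripped so it runs standalone) demonstrates a degree-1 regression removing a pure linear trend:

```python
import numpy as np
from numpy.polynomial import Legendre

def regress_poly(degree, data, remove_mean=True, axis=-1):
    """Standalone copy of the helper from the diff (logging removed)."""
    datashape = data.shape
    timepoints = datashape[axis]
    # Rearrange all voxel-wise time-series in rows
    data = data.reshape((-1, timepoints))
    # Design matrix: column of ones (degree 0) plus Legendre basis polynomials
    X = np.ones((timepoints, 1))
    for i in range(degree):
        polynomial_func = Legendre.basis(i + 1)
        value_array = np.linspace(-1, 1, timepoints)
        X = np.hstack((X, polynomial_func(value_array)[:, np.newaxis]))
    # Least-squares fit via pseudo-inverse, then subtract the fitted signal
    betas = np.linalg.pinv(X).dot(data.T)
    if remove_mean:
        datahat = X.dot(betas).T
    else:  # keep the mean: drop the degree-0 column from the fit
        datahat = X[:, 1:].dot(betas[1:, ...]).T
    regressed_data = data - datahat
    return regressed_data.reshape(datashape)

# A pure linear trend lies in the span of {1, x}, so a degree-1
# regression should remove it down to numerical noise:
t = np.linspace(0, 1, 50)
residual = regress_poly(1, (3.0 * t + 2.0)[np.newaxis, :])
print(np.abs(residual).max() < 1e-8)  # → True
```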

nipype/tests/test_nipype.py

Lines changed: 5 additions & 1 deletion
@@ -6,4 +6,8 @@ def test_nipype_info():
         get_info()
     except Exception as e:
         exception_not_raised = False
-    assert exception_not_raised
+    assert exception_not_raised
+
+# def test_fail_always():
+#     assert False
+

nipype/workflows/smri/niftyreg/groupwise.py

Lines changed: 18 additions & 13 deletions
@@ -1,12 +1,10 @@
 # emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
 # vi: set ft=python sts=4 ts=4 sw=4 et:

-'''
-This file provides some common registration routines useful for a variety of
-pipelines.
-
-Including linear and non-linear image co-registration
-'''
+"""
+Example of registration workflows using niftyreg, useful for a variety of
+pipelines. Including linear and non-linear image co-registration
+"""

 from builtins import str, range
 import nipype.interfaces.utility as niu
@@ -20,7 +18,7 @@ def create_linear_gw_step(name="linear_gw_niftyreg",
                           use_mask=False,
                           verbose=False):
     """
-    Creates a workflow that perform linear co-registration of a set of images
+    Creates a workflow that performs linear co-registration of a set of images
     using RegAladin, producing an average image and a set of affine
     transformation matrices linking each of the floating images to the average.
@@ -38,6 +36,7 @@ def create_linear_gw_step(name="linear_gw_niftyreg",
     outputspec.aff_files - The affine transformation files

     Optional arguments::
+
         linear_options_hash - An options dictionary containing a list of
                               parameters for RegAladin that take
                               the same form as given in the interface (default None)
@@ -51,8 +50,8 @@ def create_linear_gw_step(name="linear_gw_niftyreg",

     >>> from nipype.workflows.smri.niftyreg import create_linear_gw_step
     >>> lgw = create_linear_gw_step('my_linear_coreg')  # doctest: +SKIP
-    >>> lgw.inputs.inputspec.in_files = ['file1.nii.gz', 'file2.nii.gz'] \
-    # doctest: +SKIP
+    >>> lgw.inputs.inputspec.in_files = [
+    ...     'file1.nii.gz', 'file2.nii.gz']  # doctest: +SKIP
     >>> lgw.inputs.inputspec.ref_file = ['ref.nii.gz']  # doctest: +SKIP
     >>> lgw.run()  # doctest: +SKIP
@@ -121,6 +120,7 @@ def create_nonlinear_gw_step(name="nonlinear_gw_niftyreg",
     cpp transformation linking each of the floating images to the average.

     Inputs::
+
         inputspec.in_files - The input files to be registered
         inputspec.ref_file - The initial reference image that the input files
                              are registered to
@@ -134,6 +134,7 @@ def create_nonlinear_gw_step(name="nonlinear_gw_niftyreg",
     outputspec.cpp_files - The bspline transformation files

     Optional arguments::
+
         nonlinear_options_hash - An options dictionary containing a list of
                                  parameters for RegAladin that take the
                                  same form as given in the interface (default None)
@@ -144,8 +145,8 @@ def create_nonlinear_gw_step(name="nonlinear_gw_niftyreg",
     -------
     >>> from nipype.workflows.smri.niftyreg import create_nonlinear_gw_step
     >>> nlc = create_nonlinear_gw_step('nonlinear_coreg')  # doctest: +SKIP
-    >>> nlc.inputs.inputspec.in_files = ['file1.nii.gz', 'file2.nii.gz'] \
-    # doctest: +SKIP
+    >>> nlc.inputs.inputspec.in_files = [
+    ...     'file1.nii.gz', 'file2.nii.gz']  # doctest: +SKIP
     >>> nlc.inputs.inputspec.ref_file = ['ref.nii.gz']  # doctest: +SKIP
     >>> nlc.run()  # doctest: +SKIP
@@ -246,6 +247,7 @@ def create_groupwise_average(name="atlas_creation",
     non-linear components.

     Inputs::
+
         inputspec.in_files - The input files to be registered
         inputspec.ref_file - The initial reference image that the input files
                              are registered to
@@ -258,12 +260,14 @@ def create_groupwise_average(name="atlas_creation",
     outputspec.average_image - The average image
     outputspec.cpp_files - The bspline transformation files

+
     Example
     -------
+
     >>> from nipype.workflows.smri.niftyreg import create_groupwise_average
     >>> node = create_groupwise_average('groupwise_av')  # doctest: +SKIP
-    >>> node.inputs.inputspec.in_files = ['file1.nii.gz', 'file2.nii.gz'] \
-    # doctest: +SKIP
+    >>> node.inputs.inputspec.in_files = [
+    ...     'file1.nii.gz', 'file2.nii.gz']  # doctest: +SKIP
     >>> node.inputs.inputspec.ref_file = ['ref.nii.gz']  # doctest: +SKIP
     >>> node.inputs.inputspec.rmask_file = ['mask.nii.gz']  # doctest: +SKIP
     >>> node.run()  # doctest: +SKIP
@@ -384,3 +388,4 @@ def create_groupwise_average(name="atlas_creation",
         ])

     return workflow
+
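The doctest edits in this file fix a subtle parsing problem: with the old backslash continuation, the `# doctest: +SKIP` comment landed on a line without a `...` prompt, so doctest treated it as expected output rather than as an option directive. A small sketch using only the standard `doctest` module (the `items` example is hypothetical) shows that the corrected form, with an explicit `...` continuation prompt, keeps the directive inside the example's source:

```python
import doctest

# Corrected style from the diff: the list literal is continued with an
# explicit '...' (PS2) prompt, so '# doctest: +SKIP' stays part of the
# example's source and the parser attaches it as an option.
good = """
>>> items = [
...     'file1.nii.gz', 'file2.nii.gz']  # doctest: +SKIP
"""

example = doctest.DocTestParser().get_examples(good)[0]
print(example.options.get(doctest.SKIP))  # → True
```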
