
Commit 027f940

Merge pull request #1177 from chrisfilo/enh/circle_ci

Support for CircleCI

2 parents: a453b8b + c649832

32 files changed: +467 −366 lines

circle.yml

Lines changed: 51 additions & 0 deletions
@@ -0,0 +1,51 @@
dependencies:
  cache_directories:
    - "~/examples/data"
    - "~/examples/fsdata"
    - "~/examples/feeds"
    - "~/mcr"
    - "~/spm12"
    - "~/examples/fsl_course_data"
  override:
    - pip install --upgrade pip
    - pip install -e .
    - pip install matplotlib sphinx ipython
    - if [[ ! -d ~/examples/data ]]; then wget "http://tcpdiag.dl.sourceforge.net/project/nipy/nipype/nipype-0.2/nipype-tutorial.tar.bz2"; tar jxvf nipype-tutorial.tar.bz2; mkdir ~/examples; mv nipype-tutorial/* ~/examples/; fi
    # we download this manually because CircleCI does not cache apt
    - if [[ ! -d ~/examples/feeds ]]; then wget "http://fsl.fmrib.ox.ac.uk/fsldownloads/fsl-5.0.8-feeds.tar.gz"; tar zxvf fsl-5.0.8-feeds.tar.gz; mv feeds ~/examples/; fi
    - if [[ ! -d ~/examples/fsl_course_data ]]; then wget -c "http://fsl.fmrib.ox.ac.uk/fslcourse/fdt1.tar.gz"; wget -c "http://fsl.fmrib.ox.ac.uk/fslcourse/fdt2.tar.gz"; wget -c "http://fsl.fmrib.ox.ac.uk/fslcourse/tbss.tar.gz"; mkdir ~/examples/fsl_course_data; tar zxvf fdt1.tar.gz -C ~/examples/fsl_course_data; tar zxvf fdt2.tar.gz -C ~/examples/fsl_course_data; tar zxvf tbss.tar.gz -C ~/examples/fsl_course_data; fi
    - wget -O- http://neuro.debian.net/lists/trusty.us-nh.full | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list
    - sudo apt-key adv --recv-keys --keyserver hkp://pgp.mit.edu:80 0xA5D32F012649A5A9
    - sudo apt-get update -y; sudo apt-get install -y fsl-core fsl-atlases
    - bash ~/nipype/tools/install_spm_mcr.sh
    - mkdir ~/.nipype && echo "[logging]" > ~/.nipype/nipype.cfg && echo "workflow_level = DEBUG" >> ~/.nipype/nipype.cfg && echo "interface_level = DEBUG" >> ~/.nipype/nipype.cfg && echo "filemanip_level = DEBUG" >> ~/.nipype/nipype.cfg
test:
  override:
    - . /usr/share/fsl/5.0/etc/fslconf/fsl.sh && nosetests --with-doctest --logging-level=DEBUG --verbosity=3:
        environment:
          SPMMCRCMD: "$HOME/spm12/run_spm12.sh $HOME/mcr/v85/ script"
          FORCE_SPMMCR: 1
          FSL_COURSE_DATA: "$HOME/examples/fsl_course_data"
        timeout: 2600
    - set -o pipefail && cd doc && make html 2>&1 | tee ~/log.txt
    - cat ~/log.txt && if grep -q "ERROR" ~/log.txt; then false; else true; fi
    - . /usr/share/fsl/5.0/etc/fslconf/fsl.sh && python ~/nipype/tools/run_examples.py fmri_fsl_feeds Linear l1pipeline:
        pwd: ../examples
    - . /usr/share/fsl/5.0/etc/fslconf/fsl.sh && python ~/nipype/tools/run_examples.py fmri_spm_dartel Linear level1 l2pipeline:
        pwd: ../examples
        environment:
          SPMMCRCMD: "$HOME/spm12/run_spm12.sh $HOME/mcr/v85/ script"
          FORCE_SPMMCR: 1
        timeout: 1600
    - . /usr/share/fsl/5.0/etc/fslconf/fsl.sh && python ~/nipype/tools/run_examples.py fmri_fsl_reuse Linear level1_workflow:
        pwd: ../examples
    - . /usr/share/fsl/5.0/etc/fslconf/fsl.sh && python ~/nipype/tools/run_examples.py fmri_spm_nested Linear level1 l2pipeline:
        pwd: ../examples
        environment:
          SPMMCRCMD: "$HOME/spm12/run_spm12.sh $HOME/mcr/v85/ script"
          FORCE_SPMMCR: 1

general:
  artifacts:
    - "doc/_build/html"
    - "~/log.txt"

doc/devel/cmd_interface_devel.rst

Lines changed: 25 additions & 25 deletions
@@ -21,15 +21,15 @@ grouped into separate classes (usually suffixed with InputSpec and OutputSpec).
 For example:
 
 .. testcode::
-
+
     class ExampleInputSpec(TraitedSpec):
         input_volume = File(desc="Input volume", exists=True,
                             mandatory=True)
        parameter = traits.Int(desc="some parameter")
-
+
     class ExampleOutputSpec(TraitedSpec):
         output_volume = File(desc="Output volume", exists=True)
-
+
 For Traits (and Nipype) to work correctly, the output and input specs have to
 inherit from TraitedSpec (however, this does not have to be direct
 inheritance).
@@ -39,7 +39,7 @@ above example we have used the ``desc`` metadata which holds a human readable
 description of the input. The ``mandatory`` flag forces Nipype to throw an
 exception if the input was not set. ``exists`` is a special flag that works only
 for ``File`` traits and checks if the provided file exists. More details can be
-found at `interface_specs`_.
+found at :doc:`interface_specs`.
 
 The input and output specifications have to be connected to our example
 interface class:
@@ -49,13 +49,13 @@ interface class:
     class Example(Interface):
         input_spec = ExampleInputSpec
         output_spec = ExampleOutputSpec
-
+
 While the names of the classes grouping inputs and outputs are arbitrary, the
 names of the fields within the interface they are assigned to are not (they
 always have to be input_spec and output_spec). Of course this interface does
 not do much because we have not specified how to process the inputs and create
 the outputs. This can be done in many ways.
-
+
 Command line executable
 =======================
 
@@ -67,32 +67,32 @@ between the inputs and the generated command line. To achieve this we have
 added two metadata: ``argstr`` (string defining how the argument should be
 formatted) and ``position`` (number defining the order of the arguments).
 For example:
-
+
 .. testcode::
 
     class ExampleInputSpec(CommandLineInputSpec):
         input_volume = File(desc="Input volume", exists=True,
                             mandatory=True, position=0, argstr="%s")
         parameter = traits.Int(desc="some parameter", argstr="--param %d")
-
+
 As you probably noticed, ``argstr`` is a printf-type string with formatting
 symbols. For an input defined in the InputSpec to be included in the executed
 command line, its ``argstr`` has to be defined. Additionally, inside the main
 interface class you need to specify the name of the executable by assigning it
 to the ``_cmd`` field. The main interface class also needs to inherit from
-`CommandLine`_:
+:class:`nipype.interfaces.base.CommandLine`:
 
 .. testcode::
 
     class Example(CommandLine):
         _cmd = 'my_command'
         input_spec = ExampleInputSpec
         output_spec = ExampleOutputSpec
-
+
 There is one more thing we need to take care of. When the executable finishes
 processing it will presumably create some output files. We need to know which
 files to look for, check if they exist, and expose them to whatever node would
-like to use them. This is done by implementing `_list_outputs`_ method in the
+like to use them. This is done by implementing the ``_list_outputs`` method in the
 main interface class. Basically what it does is assign the expected output
 files to the fields of our output spec:
 
@@ -102,7 +102,7 @@ files to the fields of our output spec:
         outputs = self.output_spec().get()
         outputs['output_volume'] = os.path.abspath('name_of_the_file_this_cmd_made.nii')
         return outputs
-
+
 Sometimes the inputs need extra parsing before turning into command line
 parameters. For example, imagine a parameter selecting between three methods:
 "old", "standard" and "new". Imagine also that the command line accepts this as
@@ -122,42 +122,42 @@ numbers. We need to do additional parsing by overloading the following method
 in the main interface class:
 
 .. testcode::
-
+
     def _format_arg(self, name, spec, value):
         if name == 'method':
             return spec.argstr % {"old": 0, "standard": 1, "new": 2}[value]
         return super(Example, self)._format_arg(name, spec, value)
-
+
 Here is a minimalistic interface for the gzip command:
 
 .. testcode::
-
+
     from nipype.interfaces.base import (
-        TraitedSpec,
+        TraitedSpec,
         CommandLineInputSpec,
-        CommandLine,
+        CommandLine,
         File
     )
     import os
-
+
     class GZipInputSpec(CommandLineInputSpec):
         input_file = File(desc="File", exists=True, mandatory=True, argstr="%s")
-
+
     class GZipOutputSpec(TraitedSpec):
         output_file = File(desc="Zip file", exists=True)
-
+
     class GZipTask(CommandLine):
         input_spec = GZipInputSpec
         output_spec = GZipOutputSpec
         _cmd = 'gzip'
-
+
         def _list_outputs(self):
             outputs = self.output_spec().get()
             outputs['output_file'] = os.path.abspath(self.inputs.input_file + ".gz")
             return outputs
-
+
     if __name__ == '__main__':
-
+
         zipper = GZipTask(input_file='an_existing_file')
         print zipper.cmdline
         zipper.run()
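With ``_cmd = 'gzip'`` and the ``"%s"`` ``argstr``, the printed command line in this example comes out as ``gzip an_existing_file``.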
@@ -193,9 +193,9 @@ hash_files
 
 name_template (optional)
     overrides the default ``_generated`` suffix
-
+
 output_name (optional)
-    name of the output (if this is not set same name as the input will be
+    name of the output (if this is not set, the same name as the input will be
     assumed)
 
 keep_extension (optional - not used)
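A hedged sketch of how these output-name metadata are typically combined on an output ``File`` trait; the ``name_source`` attribute tying the template to an input sits outside this hunk and is assumed here::

    class ExampleInputSpec(CommandLineInputSpec):
        input_file = File(exists=True, mandatory=True, argstr="%s", position=0)
        # name_source/name_template below follow the metadata described above
        # (an assumption): derive output_file from input_file, with the
        # template's suffix appended and hashing of the filename disabled.
        output_file = File(argstr="%s", position=1, hash_files=False,
                           name_source=["input_file"],
                           name_template="%s_generated")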

doc/index.rst

Lines changed: 25 additions & 25 deletions
@@ -4,31 +4,31 @@
       :width: 100 %
   - .. container::
 
      (whitespace-only change: the paragraph below was re-indented to nest
      inside the ``.. container::`` cell; old and new text are otherwise
      identical, so the paragraph is shown once)
 
      Current neuroimaging software offer users an incredible opportunity to
      analyze data using a variety of different algorithms. However, this has
      resulted in a heterogeneous collection of specialized applications
      without transparent interoperability or a uniform operating interface.
 
      *Nipype*, an open-source, community-developed initiative under the
      umbrella of NiPy_, is a Python project that provides a uniform interface
      to existing neuroimaging software and facilitates interaction between
      these packages within a single workflow. Nipype provides an environment
      that encourages interactive exploration of algorithms from different
      packages (e.g., ANTS_, SPM_, FSL_, FreeSurfer_, Camino_, MRtrix_, MNE_, AFNI_,
      Slicer_), eases the design of workflows within and between packages, and
      reduces the learning curve necessary to use different packages. Nipype is
      creating a collaborative platform for neuroimaging software development
      in a high-level language and addressing limitations of existing pipeline
      systems.
 
      *Nipype* allows you to:
 
      * easily interact with tools from different software packages
      * combine processing steps from different software packages
      * develop new workflows faster by reusing common steps from old ones
      * process data faster by running it in parallel on many cores/machines
      * make your research easily reproducible
      * share your processing workflows with the community
 
 .. admonition:: Reference

doc/users/config_file.rst

Lines changed: 6 additions & 6 deletions
@@ -80,8 +80,8 @@ Execution
 *remove_unnecessary_outputs*
   This will remove any interface outputs not needed by the workflow. If the
   required outputs from a node change, rerunning the workflow will rerun the
-  node. Outputs of leaf nodes (nodes whose outputs are not connected to any
-  other nodes) will never be deleted independent of this parameter. (possible
+  node. Outputs of leaf nodes (nodes whose outputs are not connected to any
+  other nodes) will never be deleted, independently of this parameter. (possible
   values: ``true`` and ``false``; default value: ``true``)
 
 *try_hard_link_datasink*

@@ -129,7 +129,7 @@ Execution
   If this is set to True, the node's output directory will contain the full
   parameterization of any iterable, otherwise parameterizations over 32
   characters will be replaced by their hash. (possible values: ``true`` and
-  ``false``; default value: ``true``)
+  ``false``; default value: ``true``)
 
 *poll_sleep_duration*
   This controls how long the job submission loop will sleep between submitting

@@ -146,7 +146,7 @@ Example
 
     [logging]
     workflow_level = DEBUG
-
+
    [execution]
     stop_on_first_crash = true
     hash_method = timestamp

@@ -156,9 +156,9 @@ Workflow.config property has a form of a nested dictionary reflecting the
 structure of the .cfg file.
 
 ::
-
+
     myworkflow = pe.Workflow()
-    myworkflow.config['execution'] = {'stop_on_first_rerun': 'True',
+    myworkflow.config['execution'] = {'stop_on_first_rerun': 'True',
                                       'hash_method': 'timestamp'}
 
 You can also directly set global config options in your workflow script. An
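For reference, a minimal sketch of setting global config options from a workflow script, assuming nipype exposes its configuration object as ``nipype.config`` with an ``update_config`` helper::

    from nipype import config

    # nested dict mirroring the [logging] / [execution] sections of nipype.cfg
    cfg = dict(logging=dict(workflow_level='DEBUG'),
               execution={'stop_on_first_crash': True,
                          'hash_method': 'timestamp'})
    config.update_config(cfg)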

doc/users/plugins.rst

Lines changed: 17 additions & 18 deletions
@@ -71,7 +71,7 @@ a local system.
 
 Optional arguments::
 
-    n_procs : Number of processes to launch in parallel, if not set number of
+    n_procs : Number of processes to launch in parallel; if not set, the number of
     processors/threads will be automatically detected
 
 To distribute processing on a multicore machine, simply call::
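The call itself falls outside this hunk; a minimal sketch, assuming the ``MultiProc`` plugin name and the ``n_procs`` argument shown above::

    workflow.run(plugin='MultiProc', plugin_args={'n_procs': 4})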
@@ -118,7 +118,7 @@ Optional arguments::
 
 For example, the following snippet executes the workflow on myqueue with
 a custom template::
-
+
     workflow.run(plugin='SGE',
                  plugin_args=dict(template='mytemplate.sh', qsub_args='-q myqueue'))

@@ -136,18 +136,18 @@ particular node might use more resources than other nodes in a workflow.
 this local configuration::
 
     node.plugin_args = {'qsub_args': '-l nodes=1:ppn=3', 'overwrite': True}
-
+
 SGEGraph
 ~~~~~~~~
 SGEGraph_ is an execution plugin working with Sun Grid Engine that allows for
-submitting entire graph of dependent jobs at once. This way Nipype does not
+submitting an entire graph of dependent jobs at once. This way Nipype does not
 need to run a monitoring process - SGE takes care of this. The use of SGEGraph_
-is preferred over SGE_ since the latter adds unnecessary load on the submit
+is preferred over SGE_ since the latter adds unnecessary load on the submit
 machine.
 
 .. note::
 
-    When rerunning unfinished workflows using SGEGraph you may decide not to
+    When rerunning unfinished workflows using SGEGraph you may decide not to
     submit jobs for Nodes that previously finished running. This can speed up
     execution, but new or modified inputs that would previously trigger a Node
     to rerun will be ignored. The following option turns on this functionality::
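The option itself is not shown in this hunk; a hedged sketch, assuming the ``dont_resubmit_completed_jobs`` plugin argument accepted by the graph-based plugins::

    workflow.run(plugin='SGEGraph',
                 plugin_args={'dont_resubmit_completed_jobs': True})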
@@ -177,20 +177,20 @@ Optional arguments::
 
     template: custom template file to use
     sbatch_args: any other command line args to be passed to sbatch.
-
-
+
+
 SLURMGraph
 ~~~~~~~~~~
 SLURMGraph_ is an execution plugin working with SLURM that allows for
-submitting entire graph of dependent jobs at once. This way Nipype does not
+submitting an entire graph of dependent jobs at once. This way Nipype does not
 need to run a monitoring process - SLURM takes care of this. The use of the SLURMGraph_
-plugin is preferred over the vanilla SLURM_ plugin since the latter adds
-unnecessary load on the submit machine.
+plugin is preferred over the vanilla SLURM_ plugin since the latter adds
+unnecessary load on the submit machine.
 
 
 .. note::
 
-    When rerunning unfinished workflows using SLURMGraph you may decide not to
+    When rerunning unfinished workflows using SLURMGraph you may decide not to
     submit jobs for Nodes that previously finished running. This can speed up
     execution, but new or modified inputs that would previously trigger a Node
     to rerun will be ignored. The following option turns on this functionality::
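As with SGEGraph above, a hedged sketch under the same ``dont_resubmit_completed_jobs`` assumption::

    workflow.run(plugin='SLURMGraph',
                 plugin_args={'dont_resubmit_completed_jobs': True})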
@@ -205,11 +205,11 @@ DAGMan
 ~~~~~~
 
 With its DAGMan_ component HTCondor_ (previously Condor) allows for submitting
-entire graphs of dependent jobs at once (similar to SGEGraph_ and SLURMGaaoh_).
-With the ``CondorDAGMan`` plug-in Nipype can utilize this functionality to
-submit complete workflows directly and in a single step. Consequently, and
-in contrast to other plug-ins, workflow execution returns almost
-instantaneously -- Nipype is only used to generate the workflow graph,
+entire graphs of dependent jobs at once (similar to SGEGraph_ and SLURMGraph_).
+With the ``CondorDAGMan`` plug-in Nipype can utilize this functionality to
+submit complete workflows directly and in a single step. Consequently, and
+in contrast to other plug-ins, workflow execution returns almost
+instantaneously -- Nipype is only used to generate the workflow graph,
 while job scheduling and dependency resolution are entirely managed by HTCondor_.
 
 Please note that although DAGMan_ supports specification of data dependencies

@@ -320,4 +320,3 @@ Optional arguments::
 .. _HTCondor documentation: http://research.cs.wisc.edu/htcondor/manual
 .. _DMTCP: http://dmtcp.sourceforge.net
 .. _SLURM: http://slurm.schedmd.com/
-
