Module: pipeline.engine

Inheritance diagram for nipype.pipeline.engine:

Defines functionality for pipelined execution of interfaces

The Pipeline class provides core functionality for batch processing.

Change directory to provide relative paths for doctests:

>>> import os
>>> filepath = os.path.dirname(os.path.realpath(__file__))
>>> datadir = os.path.realpath(os.path.join(filepath, '../testing/data'))
>>> os.chdir(datadir)

Classes

MapNode

class nipype.pipeline.engine.MapNode(interface, iterfield=None, **kwargs)

Bases: nipype.pipeline.engine.Node

Wraps interface objects that need to be iterated on a list of inputs.

Examples

>>> import nipype.interfaces.fsl as fsl
>>> realign = MapNode(interface=fsl.MCFLIRT(), name='realign', iterfield=['in_file']) 
>>> realign.inputs.in_file = ['functional.nii', 'functional2.nii', 'functional3.nii'] 
>>> realign.run() 
__init__(interface, iterfield=None, **kwargs)
Parameters :

iterfield : 1+-element list

key(s) over which to repeatedly call the interface. For example, to iterate FSL.Bet over multiple files, one can set node.iterfield = ['infile']. If this list has more than one item, then the inputs are selected simultaneously, in order, from each of these fields, and each field will need to have the same number of members.
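
For instance, a minimal sketch (assuming FSL's BET interface and the test images used elsewhere in these docs) of pairing two iterfields of equal length, consumed in lockstep:

>>> import nipype.interfaces.fsl as fsl
>>> bet = MapNode(interface=fsl.BET(), name='bet', iterfield=['in_file', 'frac'])
>>> bet.inputs.in_file = ['functional.nii', 'functional2.nii']
>>> bet.inputs.frac = [0.4, 0.5]
>>> res = bet.run()  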

clone(name)

Clone a workflowbase object

Parameters :

name : string (mandatory)

A clone of node or workflow must have a new name
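
A minimal sketch, reusing the realign MapNode from the example above; the clone name 'realign_copy' is arbitrary:

>>> realign2 = realign.clone('realign_copy')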

fullname
get_output(parameter)

Retrieve a particular output of the node

get_subnodes()
hash_exists(updatehash=False)
help()

Print interface help

inputs
interface

Return the underlying interface object

load(filename)
num_subnodes()
output_dir()

Return the location of the output directory for the node

outputs
result

Return the result object after the node has run

run(updatehash=False)

Execute the node in its directory.

Parameters :

updatehash : boolean

Update the hash stored in the output directory

save(filename=None)
set_input(parameter, val)

Set interface input value or nodewrapper attribute

Priority goes to interface.

update(**opts)
write_report(report_type=None, cwd=None)

Node

class nipype.pipeline.engine.Node(interface, iterables=None, overwrite=None, needed_outputs=None, run_without_submitting=False, **kwargs)

Bases: nipype.pipeline.engine.WorkflowBase

Wraps interface objects for use in pipeline

A Node creates a sandbox-like directory for executing the underlying interface. It will copy or link inputs into this directory to ensure that input data are not overwritten. A hash of the input state is used to determine if the Node inputs have changed and whether the node needs to be re-executed.

Examples

>>> import nipype.interfaces.spm as spm
>>> realign = Node(interface=spm.Realign(), name='realign')
>>> realign.inputs.in_files = 'functional.nii'
>>> realign.inputs.register_to_mean = True
>>> realign.run() 
__init__(interface, iterables=None, overwrite=None, needed_outputs=None, run_without_submitting=False, **kwargs)
Parameters :

interface : interface object

node-specific interface (e.g., fsl.Bet(), spm.Coregister())

iterables : generator

input field and list to iterate using the pipeline engine; for example, to iterate over different frac values in fsl.Bet(). For a single field the input can be a tuple, otherwise a list of tuples (see the sketch after this parameter list):

node.iterables = ('frac', [0.5, 0.6, 0.7])
node.iterables = [('fwhm', [2, 4]), ('fieldx', [0.5, 0.6, 0.7])]

overwrite : Boolean

Whether to overwrite the contents of the output directory if it already exists. If the directory exists and the hash matches, the process is assumed to have been executed already.

needed_outputs : list of output_names

Force the node to keep only specific outputs. By default all outputs are kept. Setting this attribute will delete any output files and directories from the node’s working directory that are not part of the needed_outputs.

run_without_submitting : boolean

Run the node without submitting to a job engine or to a multiprocessing pool
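
A minimal sketch of the single-field iterables form (assuming FSL's BET interface and the structural.nii test image); the expansion itself takes place when the node is run as part of a workflow:

>>> import nipype.interfaces.fsl as fsl
>>> bet = Node(interface=fsl.BET(), name='bet')
>>> bet.inputs.in_file = 'structural.nii'
>>> bet.iterables = ('frac', [0.3, 0.5, 0.7])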

clone(name)

Clone a workflowbase object

Parameters :

name : string (mandatory)

A clone of node or workflow must have a new name

fullname
get_output(parameter)

Retrieve a particular output of the node

hash_exists(updatehash=False)
help()

Print interface help

inputs

Return the inputs of the underlying interface

interface

Return the underlying interface object

load(filename)
output_dir()

Return the location of the output directory for the node

outputs

Return the output fields of the underlying interface

result

Return the result object after the node has run

run(updatehash=False)

Execute the node in its directory.

Parameters :

updatehash : boolean

Update the hash stored in the output directory

save(filename=None)
set_input(parameter, val)

Set interface input value

update(**opts)
write_report(report_type=None, cwd=None)

Workflow

class nipype.pipeline.engine.Workflow(**kwargs)

Bases: nipype.pipeline.engine.WorkflowBase

Controls the setup and execution of a pipeline of processes

__init__(**kwargs)
add_nodes(nodes)

Add nodes to a workflow

Parameters :

nodes : list

A list of WorkflowBase-based objects

clone(name)

Clone a workflow

Note

Will reset attributes used for executing workflow. See _init_runtime_fields.

Parameters :

name : string (mandatory)

every clone requires a new name

connect(*args, **kwargs)

Connect nodes in the pipeline.

This routine also checks if inputs and outputs are actually provided by the nodes that are being connected.

Creates edges in the directed graph using the nodes and edges specified in the connection_list. Uses the NetworkX method DiGraph.add_edges_from.

Parameters :

args : list or a set of four positional arguments

Four positional arguments of the form:

connect(source, sourceoutput, dest, destinput)

source : nodewrapper node
sourceoutput : string (must be in source.outputs)
dest : nodewrapper node
destinput : string (must be in dest.inputs)

A list of 3-tuples of the following form:

[(source, target,
    [('sourceoutput/attribute', 'targetinput'),
    ...]),
...]

Or:

[(source, target, [(('sourceoutput1', func, arg2, ...),
                            'targetinput'), ...]),
...]
sourceoutput1 will always be the first argument to func, and func will be evaluated and the results sent to targetinput.

Currently, func needs to define all its needed imports within the function, as the inspect module is used to get at the source code and execute it remotely.
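
A hedged usage sketch of the four-argument form, assuming FSL's MCFLIRT and BET interfaces (whose out_file and in_file fields are connected here):

>>> import nipype.interfaces.fsl as fsl
>>> wf = Workflow(name='preproc')
>>> realign = Node(fsl.MCFLIRT(), name='realign')
>>> skullstrip = Node(fsl.BET(), name='skullstrip')
>>> wf.connect(realign, 'out_file', skullstrip, 'in_file')

The same connection could instead be written in list form as wf.connect([(realign, skullstrip, [('out_file', 'in_file')])]).
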
disconnect(*args)

Disconnect two nodes

See the docstring for connect for format.

fullname
get_node(name)

Return an internal node by name

inputs
list_node_names()

List names of all nodes in a workflow

load(filename)
outputs
remove_nodes(nodes)

Remove nodes from a workflow

Parameters :

nodes : list

A list of WorkflowBase-based objects

run(plugin=None, plugin_args=None, updatehash=False)

Execute the workflow

Parameters :

plugin : plugin name or object

Plugin to use for execution. You can create your own plugins for execution.

plugin_args : dictionary

Arguments to be sent to the plugin constructor. See individual plugin docstrings for details.
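
A hedged sketch, reusing the wf workflow from the connect example above and assuming the built-in 'MultiProc' plugin with its n_procs argument:

>>> wf.run()  
>>> wf.run(plugin='MultiProc', plugin_args={'n_procs': 2})  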

save(filename=None)
write_graph(dotfilename='graph.dot', graph2use='hierarchical', format='png', simple_form=True)

Generates a graphviz dot file and a png file

Parameters :

graph2use : 'orig', 'hierarchical' (default), 'flat', 'exec'

orig - creates a top-level graph without expanding internal workflow nodes; flat - expands workflow nodes recursively; exec - expands workflows to depict iterables

format : 'png', 'svg'

simple_form : boolean (default: True)

Determines if the node name used in the graph should be of the form 'nodename (package)' when True or 'nodename.Class.package' when False.
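
For example, reusing wf from above (Graphviz must be installed); only parameters from the signature above are used:

>>> wf.write_graph(dotfilename='graph.dot', graph2use='flat', format='png')  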

write_hierarchical_dotfile(dotfilename=None, colored=True, simple_form=True)

WorkflowBase

class nipype.pipeline.engine.WorkflowBase(name=None, base_dir=None, **kwargs)

Bases: object

Define common attributes and functions for workflows and nodes

__init__(name=None, base_dir=None, **kwargs)

Initialize base parameters of a workflow or node

Parameters :

base_dir : string

base output directory (will be hashed before creation); default=None, which results in the use of mkdtemp

name : string (mandatory)

Name of this node. Name must be alphanumeric and not contain any special characters (e.g., ‘.’, ‘@’).
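
A minimal sketch; the '/tmp/workdir' path is purely illustrative:

>>> wf = Workflow(name='preproc', base_dir='/tmp/workdir')

When base_dir is left as None, a temporary directory created with mkdtemp is used instead, as noted above.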

clone(name)

Clone a workflowbase object

Parameters :

name : string (mandatory)

A clone of node or workflow must have a new name

fullname
inputs
load(filename)
outputs
save(filename=None)