cpac's Issues

test_config not working, giving an error

Hi,

I'm using the latest bids/cpac and I get the following error when running test_config using Docker and Singularity:

#### Running C-PAC
Number of participants to run in parallel: 1
Input directory: /bids_dataset
Output directory: /output/output
Working directory: /scratch/working
Crash directory: /output/crash
Log directory: /output/log
Remove working directory: True
Available memory: 6.0 (GB)
Available threads: 1
Number of threads for ANTs: 1
args.aws_data_input_creds None
Traceback (most recent call last):
  File "/code/run.py", line 338, in <module>
    config_dir="/scratch/")
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/utils/build_data_config.py", line 882, in get_BIDS_data_dct
    sites_dct=sites_subs_dct)
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/utils/build_data_config.py", line 1377, in get_nonBIDS_data
    raise Exception(err)
Exception: 

[!] No anatomical input file paths found given the data settings provided.

Anatomical file template being used: /bids_dataset/*/sub-*/anat/sub-*_T1w.nii.gz

It seems to insist on finding a session folder, but my data does not have one because there is only one session.

The culprit seems to be the following call in build_data_config.py:

else:
    # no session level
    data_dct = get_nonBIDS_data(anat, func, file_list=file_list,
                                anat_scan=anat_scan,
                                scan_params_dct=scan_params_dct,
                                fmap_phase_template=fmap_phase,
                                fmap_mag_template=fmap_mag,
                                aws_creds_path=aws_creds_path,
                                inclusion_dct=inclusion_dct,
                                exclusion_dct=exclusion_dct,
                                sites_dct=sites_subs_dct)

Thanks

Eduardo
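
For illustration, the anatomical template in the error above contains an extra wildcard directory level before sub-*, so a layout with the subject folders directly under /bids_dataset can never match it. A minimal glob sketch, with illustrative paths:

# Minimal sketch (illustrative paths): the generated template expects an
# extra directory level before sub-*, so a file at
# /bids_dataset/sub-01/anat/sub-01_T1w.nii.gz is never found.
import glob

template = '/bids_dataset/*/sub-*/anat/sub-*_T1w.nii.gz'
print(glob.glob(template))  # [] for single-session data without that extra level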

Description of output files?

Hello CPAC users,

Is there a data dictionary or a description of the CPAC folders/outputs? Right now, I'm processing some resting-state data and have the following folders:

afni_centrality_0_degree afni_centrality_0_eigenvector afni_centrality_0_lfcd afni_centrality_1_degree afni_centrality_1_eigenvector afni_centrality_1_lfcd alff_collect_transforms_0 alff_collect_transforms_1 alff_falff_0 alff_fsl_to_itk_0 alff_fsl_to_itk_1 alff_to_standard_0 alff_to_standard_1 anat_gather_0 anat_mni_ants_register_0 anat_preproc_0 anat_symmetric_mni_ants_register_0 apply_ants_warp_functional_brain_mask_to_standard_0 apply_ants_warp_functional_brain_mask_to_standard_1 apply_ants_warp_functional_to_standard_0 apply_ants_warp_functional_to_standard_1 apply_ants_warp_mean_functional_to_standard_0 apply_ants_warp_mean_functional_to_standard_1 apply_ants_warp_motion_correct_to_standard_0 apply_ants_warp_motion_correct_to_standard_1 centrality_zscore_0 centrality_zscore_1 collect_transforms_functional_brain_mask_to_standard_0 collect_transforms_functional_brain_mask_to_standard_1 collect_transforms_functional_to_standard_0 collect_transforms_functional_to_standard_1 collect_transforms_mean_functional_to_standard_0 collect_transforms_mean_functional_to_standard_1 collect_transforms_motion_correct_to_standard_0 collect_transforms_motion_correct_to_standard_1 d3.js dr_tempreg_maps_files_collect_transforms_0 dr_tempreg_maps_files_collect_transforms_1 dr_tempreg_maps_files_fsl_to_itk_0 dr_tempreg_maps_files_fsl_to_itk_1 dr_tempreg_maps_files_to_standard_0 dr_tempreg_maps_files_to_standard_1 dr_tempreg_maps_stack_collect_transforms_0 dr_tempreg_maps_stack_collect_transforms_1 dr_tempreg_maps_stack_fsl_to_itk_0 dr_tempreg_maps_stack_fsl_to_itk_1 dr_tempreg_maps_stack_to_standard_0 dr_tempreg_maps_stack_to_standard_1 dr_tempreg_maps_zstat_files_collect_transforms_0 dr_tempreg_maps_zstat_files_collect_transforms_1 dr_tempreg_maps_zstat_files_fsl_to_itk_0 dr_tempreg_maps_zstat_files_fsl_to_itk_1 dr_tempreg_maps_zstat_files_to_standard_0 dr_tempreg_maps_zstat_files_to_standard_1 dr_tempreg_maps_zstat_stack_collect_transforms_0 dr_tempreg_maps_zstat_stack_collect_transforms_1 dr_tempreg_maps_zstat_stack_fsl_to_itk_0 dr_tempreg_maps_zstat_stack_fsl_to_itk_1 dr_tempreg_maps_zstat_stack_to_standard_0 dr_tempreg_maps_zstat_stack_to_standard_1 edit_func_0 falff_collect_transforms_0 falff_collect_transforms_1 falff_fsl_to_itk_0 falff_fsl_to_itk_1 falff_to_standard_0 falff_to_standard_1 fristons_parameter_model_0 fsl_to_itk_functional_brain_mask_to_standard_0 fsl_to_itk_functional_brain_mask_to_standard_1 fsl_to_itk_functional_to_standard_0 fsl_to_itk_functional_to_standard_1 fsl_to_itk_mean_functional_to_standard_0 fsl_to_itk_mean_functional_to_standard_1 fsl_to_itk_motion_correct_to_standard_0 fsl_to_itk_motion_correct_to_standard_1 func_gather_0 func_preproc_automask_0 func_to_anat_bbreg_0 func_to_anat_FLIRT_0 gen_motion_stats_0 graph1.json graph_detailed.dot graph.dot graph.json index.html log_alff_falff_0 log_alff_to_standard_smooth_0 log_alff_to_standard_smooth_1 log_anat_mni_ants_register_0 log_anat_preproc_0 log_anat_symmetric_mni_ants_register_0 log_apply_ants_warp_functional_brain_mask_to_standard_0 log_apply_ants_warp_functional_brain_mask_to_standard_1 log_apply_ants_warp_functional_to_standard_0 log_apply_ants_warp_functional_to_standard_1 log_apply_ants_warp_mean_functional_to_standard_0 log_apply_ants_warp_mean_functional_to_standard_1 log_apply_ants_warp_motion_correct_to_standard_0 log_apply_ants_warp_motion_correct_to_standard_1 log_dr_tempreg_maps_stack_smooth_0 log_dr_tempreg_maps_stack_smooth_1 log_falff_to_standard_smooth_0 log_falff_to_standard_smooth_1 log_frequency_filter_0 
log_fristons_parameter_model_0 log_func_preproc_automask_0 log_gen_motion_stats_0 log_motion_correct_to_standard_smooth_0 log_motion_correct_to_standard_smooth_1 log_network_centrality_smooth_0 log_network_centrality_smooth_1 log_nuisance_0 log_reho_0 log_reho_1 log_reho_to_standard_smooth_0 log_reho_to_standard_smooth_1 log_roi_timeseries_0 log_roi_timeseries_1 log_seg_preproc_0 log_spatial_map_timeseries_0 log_spatial_map_timeseries_1 log_spatial_map_timeseries_for_DR_0 log_spatial_map_timeseries_for_DR_1 log_temporal_dual_regression_0 log_temporal_dual_regression_1 log_vmhc_0 log_vmhc_1 nuisance_0 process_outputs_10 process_outputs_101 process_outputs_102 process_outputs_103 process_outputs_104 process_outputs_105 process_outputs_106 process_outputs_107 process_outputs_108 process_outputs_109 process_outputs_11 process_outputs_110 process_outputs_111 process_outputs_112 process_outputs_113 process_outputs_114 process_outputs_115 process_outputs_12 process_outputs_13 process_outputs_14 process_outputs_15 process_outputs_150 process_outputs_151 process_outputs_16 process_outputs_17 process_outputs_18 process_outputs_183 process_outputs_184 process_outputs_185 process_outputs_186 process_outputs_190 process_outputs_191 process_outputs_4 process_outputs_5 process_outputs_52 process_outputs_53 process_outputs_6 process_outputs_7 process_outputs_8 process_outputs_85 process_outputs_86 process_outputs_87 process_outputs_88 process_outputs_9 process_outputs_92 process_outputs_93 reho_0 reho_1 reho_collect_transforms_0 reho_collect_transforms_1 reho_fsl_to_itk_0 reho_fsl_to_itk_1 reho_to_standard_0 reho_to_standard_1 roi_dataflow_0 roi_dataflow_1 roi_timeseries_0 roi_timeseries_1 _scan_task-movieDM _scan_task-movieTP _scan_task-peer_run-1 _scan_task-peer_run-2 _scan_task-peer_run-3 _scan_task-resting_run-1 _scan_task-resting_run-2 seg_preproc_0 sinker_10 sinker_101 sinker_102 sinker_103 sinker_104 sinker_105 sinker_106 sinker_107 sinker_108 sinker_109 sinker_11 sinker_110 sinker_111 sinker_112 sinker_113 sinker_114 sinker_115 sinker_12 sinker_13 sinker_14 sinker_15 sinker_150 sinker_151 sinker_16 sinker_17 sinker_18 sinker_183 sinker_184 sinker_185 sinker_186 sinker_190 sinker_191 sinker_4 sinker_5 sinker_52 sinker_53 sinker_6 sinker_7 sinker_8 sinker_85 sinker_86 sinker_87 sinker_88 sinker_9 sinker_92 sinker_93 spatial_map_dataflow_0 spatial_map_dataflow_1 spatial_map_dataflow_for_DR_0 spatial_map_dataflow_for_DR_1 spatial_map_timeseries_0 spatial_map_timeseries_1 spatial_map_timeseries_for_DR_0 spatial_map_timeseries_for_DR_1 temporal_dual_regression_0 temporal_dual_regression_1 vmhc_0 vmhc_1

I can determine what most of those outputs are, but I wondered if there is a basic document that outlines them (and how they connect to the measures noted on the main C-PAC GitHub page, e.g., ALFF, ReHo, DC, EC, SCA, etc.).

Thanks much,
Jamie.

spikeThreshold should either be a float or a string with a percentage sign

Minor typo in the default_pipeline.yaml and the test_pipeline.yaml:

The spike threshold should be either in millimeters:
spikeThreshold: [0.5]
or a percentage:
spikeThreshold: ['5%']

I will submit a pull request shortly to resolve this.

Thank you for the useful error messages!

File: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/crash/crash-20171216-042910-jdkent-calc_spike_percent.c0.a0-099be92d-949a-4851-9772-a05729e11afb.pklz
Node: resting_preproc_sub-controlGE140_ses-post.gen_motion_stats_0.calc_spike_percent.c0.a0
Working directory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/working/resting_preproc_sub-controlGE140_ses-post/gen_motion_stats_0/_scan_task-flanker/_threshold_0.5/calc_spike_percent


Node inputs:

fd_file = /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/working/resting_preproc_sub-controlGE140_ses-post/gen_motion_stats_0/_scan_task-flanker/calculate_FDJ/FD_J.1D
function_str = def calc_percent(threshold, fd_file):
    """Calculate the de-spiking/scrubbing threshold based on the highest Mean
    FD values by some percentage.

    :param threshold: user's threshold input, either a float or string
    :param fd_file: text file containing the mean framewise displacement
    :return: a float value for the calculated threshold
    """

    if isinstance(threshold, str):
        if '%' in threshold:
            percent = int(threshold.replace('%', ''))
            percent = percent / 100.0
        else:
            err = "A string was entered for the de-spiking/scrubbing " \
                  "threshold, but there is no percent value."
            raise Exception(err)
    elif isinstance(threshold, float) or isinstance(threshold, int):
        return threshold
    else:
        err = "Invalid input for the de-spiking/scrubbing threshold."
        raise Exception(err)

    with open(fd_file, 'r') as f:
        nums = sorted([float(x.rstrip('\n')) for x in f.readlines()])

    # get the threshold value at the top percent mark provided
    threshold = nums[int(0-(len(nums) * percent))]

    return threshold

ignore_exception = False
threshold = 0.5



Traceback: 
Traceback (most recent call last):
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 52, in run_node
    result['result'] = node.run(updatehash=updatehash)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 372, in run
    self._run_interface()
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 482, in _run_interface
    self._result = self._run_command(execute)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/nodes.py", line 613, in _run_command
    result = self._interface.run()
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/base.py", line 1081, in run
    runtime = self._run_wrapper(runtime)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/base.py", line 1029, in _run_wrapper
    runtime = self._run_interface(runtime)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/utility/wrappers.py", line 194, in _run_interface
    out = function_handle(**args)
  File "<string>", line 17, in calc_percent
Exception: A string was entered for the de-spiking/scrubbing threshold, but there is no percent value.
Interface Function failed to run. 
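
For reference, here is how the calc_percent function quoted above treats each input form; these calls are illustrative and assume that definition is in scope (fd_file is only read on the percent path). A config value like spikeThreshold: ['0.5'], a string without a percent sign, lands in the branch that raises this exact exception.

# Illustrative calls against the calc_percent function quoted above.
calc_percent(0.5, fd_file='FD_J.1D')    # float: returned as-is (millimeters)
calc_percent('5%', fd_file='FD_J.1D')   # percent string: FD value at the top-5% mark
calc_percent('0.5', fd_file='FD_J.1D')  # string without '%': raises the Exception above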

Can't see GUI in Ubuntu

I'm running Ubuntu 16.04 LTS (Xenial). I cannot get the GUI to appear, even following the given instructions. Any help is appreciated.

Here is the command and the error:

sudo docker run -i --rm --privileged -e DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v /tmp:/scratch \
    -v /media/egarza/INP_MRI_Backup/projects/INP/addimex_tms/data/mri/nifti:/bids_dataset \
    -v /media/egarza/INP_MRI_Backup/projects/INP/addimex_tms/data/mri/outputs:/outputs \
    bids/cpac /bids_dataset /outputs GUI

No protocol specified
Namespace(analysis_level='GUI', aws_input_creds=None, aws_output_creds=None, bids_dir='/bids_dataset', data_config_file=None, mem_gb=None, mem_mb=None, n_cpus='1', output_dir='/outputs', participant_label=None, participant_ndx=None, pipeline_file='/cpac_resources/default_pipeline.yaml', save_working_dir=False)
Starting CPAC GUI
Unable to access the X Display, is $DISPLAY set properly?

BIDS session organization ignored

According to the documentation, a data configuration file is not needed if the data is organized according to BIDS.

 --data_config_file DATA_CONFIG_FILE
                        Yaml file containing the location of the data that is
                        to be processed. Can be generated from the CPAC gui.
                        This file is not necessary if the data in bids_dir is
                        organized according to the BIDS format. This enables
                        support for legacy data organization and cloud based
                        storage. A bids_dir must still be specified when using
                        this option, but its value will be ignored.

However, if a participant has sub-01/ses-1 and sub-01/ses-2, this information is not carried through; instead, all outputs are tagged with the label session_1. I'm guessing nothing short of providing a full data configuration file can fix this for now.
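
A minimal sketch of the expected behavior, recovering the session label from a BIDS path instead of defaulting to session_1 (the regex and path are illustrative, not CPAC's actual code):

# Sketch (illustrative): pull the ses-<label> entity out of a BIDS path
# rather than defaulting every session to "1".
import re

path = 'sub-01/ses-2/anat/sub-01_ses-2_T1w.nii.gz'
match = re.search(r'ses-([a-zA-Z0-9]+)', path)
session = match.group(1) if match else '1'
print(session)  # '2'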

Installing AFNI using CPAC install script

Hi Cameron,

I am trying to use the C-PAC install script to install FSL and AFNI to build my Docker image. It worked fine during the sprint, but I notice the download locations for the installer script have been changed from nih.gov to Amazon S3, and those downloads also seem to fail.

Did you guys try to build the Dockerfile in this repo recently? Does it work for you now? As the URLs and host server locations keep changing, is there a way to make them static, or do you have other suggestions to make this easier and less reliant on internet downloads? Thanks.

cc @chrisfilo

All required system dependencies are installed.
Installing AFNI.
--2016-10-15 15:25:04--  http://fcp-indi.s3.amazonaws.com/resources/cc_afni_trusty_openmp_64.tar.gz
Resolving fcp-indi.s3.amazonaws.com (fcp-indi.s3.amazonaws.com)... 52.216.16.80
Connecting to fcp-indi.s3.amazonaws.com (fcp-indi.s3.amazonaws.com)|52.216.16.80|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2016-10-15 15:25:05 ERROR 404: Not Found.

BIDS: session autoset to "1"

Thanks for adding this BIDS-App; we are excited to get it working.

BACKGROUND
I'm trying to get this working on our cluster, which has Singularity (v2.2.1) installed. I built the Singularity image today (after 3/4 of the commits applied to this repository today); it appears to start running okay, but errors out after an hour or so.

QUESTION
Why does CPAC think my subject label is fmriprep+controlGE140 when it's actually controlGE140 and my session label is 1 when it's actually "pre" or "post"?

This is the test-job script for our SGE cluster:

#!/bin/bash

#$ -pe smp 16
#$ -q UI
#$ -m bea
#$ -M [email protected]
#$ -o /Shared/vosslabhpc/Projects/PACR-AD/Imaging/BIDS/derivatives/code/cpac/out
#$ -e /Shared/vosslabhpc/Projects/PACR-AD/Imaging/BIDS/derivatives/code/cpac/err

singularity run -H ${HOME}/singularity_home -B /Shared/vosslabhpc:/mnt \
/Shared/vosslabhpc/UniversalSoftware/SingularityContainers/bids_cpac-2017-12-06-f45fd0b5142f.img \
/mnt/Projects/PACR-AD/Imaging/BIDS /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac \
participant --n_cpus 16 --mem_gb 32 --save_working_dir \
--participant_label controlGE140

This is the stdout (minus the BIDS validation output):

#### Running C-PAC on ['controlGE140']
Number of participants to run in parallel: 1
Input directory: /mnt/Projects/PACR-AD/Imaging/BIDS
Output directory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/output
Working directory: /scratch/working
Crash directory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/crash
Log directory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/log
Remove working directory: True
Available memory: 32.0 (GB)
Available threads: 16
Number of threads for ANTs: 4
sub-fmriprep+controlGE140 ses-1 is missing either an anat or rest (or both)

This is the stderr:

Traceback (most recent call last):
  File "/code/run.py", line 280, in <module>
    import CPAC
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/CPAC/__init__.py", line 21, in <module>
    import anat_preproc, \
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/CPAC/anat_preproc/__init__.py", line 1, in <module>
    from anat_preproc import create_anat_preproc
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/CPAC/anat_preproc/anat_preproc.py", line 1, in <module>
    from nipype.interfaces.afni import preprocess
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/__init__.py", line 49, in <module>
    from .pipeline import Node, MapNode, JoinNode, Workflow
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/__init__.py", line 10, in <module>
    from .engine import Node, MapNode, JoinNode, Workflow
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/__init__.py", line 12, in <module>
    from .workflows import Workflow
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/workflows.py", line 41, in <module>
    from ...interfaces.base import (traits, InputMultiPath, CommandLine,
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/__init__.py", line 12, in <module>
    from .io import DataGrabber, DataSink, SelectFiles
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/io.py", line 38, in <module>
    from ..utils.filemanip import copyfile, list_to_filename, filename_to_list
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/utils/filemanip.py", line 266, in <module>
    _cifs_table = _generate_cifs_table()
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/utils/filemanip.py", line 259, in _generate_cifs_table
    reverse=True)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/utils/filemanip.py", line 258, in <lambda>
    key=lambda x: len(x[0]),
IndexError: list index out of range

This is the data config YAML that was generated:

- anat: /mnt/Projects/PACR-AD/Imaging/BIDS/sub-controlGE140/ses-pre/anat/sub-controlGE140_ses-pre_T1w.nii.gz
  creds_path: null
  rest: {task-flanker: /mnt/Projects/PACR-AD/Imaging/BIDS/sub-controlGE140/ses-pre/func/sub-controlGE140_ses-pre_task-flanker_bold.nii.gz,
    task-rest: /mnt/Projects/PACR-AD/Imaging/BIDS/sub-controlGE140/ses-pre/func/sub-controlGE140_ses-pre_task-rest_bold.nii.gz}
  site_id: site-none
  subject_id: sub-controlGE140
  unique_id: ses-pre
- anat: /mnt/Projects/PACR-AD/Imaging/BIDS/sub-controlGE140/ses-post/anat/sub-controlGE140_ses-post_T1w.nii.gz
  creds_path: null
  rest: {task-flanker: /mnt/Projects/PACR-AD/Imaging/BIDS/sub-controlGE140/ses-post/func/sub-controlGE140_ses-post_task-flanker_bold.nii.gz,
    task-rest: /mnt/Projects/PACR-AD/Imaging/BIDS/sub-controlGE140/ses-post/func/sub-controlGE140_ses-post_task-rest_bold.nii.gz}
  site_id: site-none
  subject_id: sub-controlGE140
  unique_id: ses-post

This is the pipeline config that was generated:

FSLDIR: /usr/share/fsl/5.0
PRIORS_CSF: $priors_path/avg152T1_csf_bin.nii.gz
PRIORS_GRAY: $priors_path/avg152T1_gray_bin.nii.gz
PRIORS_WHITE: $priors_path/avg152T1_white_bin.nii.gz
Regressors:
- {compcor: 1, csf: 1, global: 1, gm: 0, linear: 1, motion: 1, pc1: 0, quadratic: 1,
  wm: 0}
- {compcor: 1, csf: 1, global: 0, gm: 0, linear: 1, motion: 1, pc1: 0, quadratic: 1,
  wm: 0}
TR: None
already_skullstripped: [0]
awsOutputBucketCredentials: null
boundaryBasedRegistrationSchedule: /usr/share/fsl/5.0/etc/flirtsch/bbr.sch
clusterSize: 27
configFileTwomm: $FSLDIR/etc/flirtsch/T1_2_MNI152_2mm.cnf
crashLogDirectory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/crash
degCorrelationThreshold: 0.001
degCorrelationThresholdOption: [Sparsity threshold]
degWeightOptions: [true, true]
dilated_symmetric_brain_mask: $FSLDIR/data/standard/MNI152_T1_${resolution_for_anat}_brain_mask_symmetric_dil.nii.gz
eigCorrelationThreshold: 0.001
eigCorrelationThresholdOption: [Sparsity threshold]
eigWeightOptions: [false, true]
fdCalc: [Jenkinson]
fnirtConfig: T1_2_MNI152_2mm
func_reg_input: [Mean Functional]
func_reg_input_volume: 0
functionalMasking: [3dAutoMask]
fwhm: [6]
highPassFreqALFF: [0.01]
identityMatrix: /usr/share/fsl/5.0/etc/flirtsch/ident.mat
lateral_ventricles_mask: /usr/share/fsl/5.0/data/atlases/HarvardOxford/HarvardOxford-lateral-ventricles-thr25-2mm.nii.gz
lfcdCorrelationThreshold: 0.6
lfcdCorrelationThresholdOption: [Correlation threshold]
lfcdWeightOptions: [true, true]
logDirectory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/log
lowPassFreqALFF: [0.1]
maxCoresPerParticipant: 16
maximumMemoryPerParticipant: 32.0
memoryAllocatedForDegreeCentrality: 1.0
modelConfigs: []
mrsNorm: true
nComponents: [5]
nuisanceBandpassFreq:
- [0.01, 0.1]
numGPAModelsAtOnce: 1
numParticipantsAtOnce: 1
numRemovePrecedingFrames: 1
numRemoveSubsequentFrames: 2
num_ants_threads: 4
outputDirectory: /mnt/Projects/PACR-AD/Imaging/BIDS/derivatives/cpac/output
parallelEnvironment: mpi_smp
pipelineName: analysis
priors_path: /usr/share/fsl/5.0/data/standard/tissuepriors/2mm
queue: all.q
reGenerateOutputs: false
ref_mask: /usr/share/fsl/5.0/data/standard/MNI152_T1_${resolution_for_anat}_brain_mask_symmetric_dil.nii.gz
regOption: [ANTS]
regWithSkull: [1]
removeWorkingDir: true
resolution_for_anat: 2mm
resolution_for_func_derivative: 3mm
resolution_for_func_preproc: 3mm
resourceManager: SGE
roiTSOutputs: [true, true]
runALFF: [1]
runBBReg: [1]
runFrequencyFiltering: [1, 0]
runFristonModel: [1]
runMedianAngleCorrection: [0]
runMotionSpike: ['Off']
runNetworkCentrality: [1]
runNuisance: [1]
runOnGrid: false
runROITimeseries: [1]
runReHo: [1]
runRegisterFuncToAnat: [1]
runRegisterFuncToMNI: [1]
runSCA: [1]
runScrubbing: [0]
runSegmentationPreprocessing: [1]
runSymbolicLinks: [0]
runVMHC: [1]
runZScoring: [0]
s3Encryption: [0]
sca_roi_paths:
- {/cpac_resources/cpac_templates/PNAS_Smith09_rsn10.nii.gz: DualReg}
scrubbingThreshold: [0.2]
slice_timing_correction: [1]
slice_timing_pattern: [Use NIFTI Header]
spikeThreshold: ['0.5']
startIdx: 4
stopIdx: None
targetAngleDeg: [90]
templateSpecificationFile: /cpac_resources/cpac_templates/Mask_ABIDE_85Percent_GM.nii.gz
template_brain_only_for_anat: /usr/share/fsl/5.0/data/standard/MNI152_T1_${resolution_for_anat}_brain.nii.gz
template_brain_only_for_func: /usr/share/fsl/5.0/data/standard/MNI152_T1_${resolution_for_func_preproc}_brain.nii.gz
template_skull_for_anat: /usr/share/fsl/5.0/data/standard/MNI152_T1_${resolution_for_anat}.nii.gz
template_skull_for_func: /usr/share/fsl/5.0/data/standard/MNI152_T1_${resolution_for_func_preproc}.nii.gz
template_symmetric_brain_only: $FSLDIR/data/standard/MNI152_T1_${resolution_for_anat}_brain_symmetric.nii.gz
template_symmetric_skull: $FSLDIR/data/standard/MNI152_T1_${resolution_for_anat}_symmetric.nii.gz
tsa_roi_paths:
- {/cpac_resources/cpac_templates/CC200.nii.gz: Avg, /cpac_resources/cpac_templates/CC400.nii.gz: Avg,
  /cpac_resources/cpac_templates/PNAS_Smith09_rsn10.nii.gz: SpatialReg, /cpac_resources/cpac_templates/aal_mask_pad.nii.gz: Avg,
  /cpac_resources/cpac_templates/ez_mask_pad.nii.gz: Avg, /cpac_resources/cpac_templates/ho_mask_pad.nii.gz: Avg,
  /cpac_resources/cpac_templates/rois_3mm.nii.gz: Avg, /cpac_resources/cpac_templates/tt_mask_pad.nii.gz: Avg}
workingDirectory: /scratch/working

and this is how the example subject is organized:

sub-controlGE140
├── ses-post
│   ├── anat
│   │   ├── sub-controlGE140_ses-post_acq-1_T2w.nii.gz
│   │   ├── sub-controlGE140_ses-post_acq-2_T2w.nii.gz
│   │   └── sub-controlGE140_ses-post_T1w.nii.gz
│   ├── cbf
│   │   ├── sub-controlGE140_ses-post_acq-ASL_cbf.nii.gz
│   │   └── sub-controlGE140_ses-post_acq-cbf_cbf.nii.gz
│   ├── dwi
│   │   ├── sub-controlGE140_ses-post_acq-60D_dwi.bval
│   │   ├── sub-controlGE140_ses-post_acq-60D_dwi.bvec
│   │   ├── sub-controlGE140_ses-post_acq-60D_dwi.nii.gz
│   │   ├── sub-controlGE140_ses-post_acq-B0_dwi.bval
│   │   ├── sub-controlGE140_ses-post_acq-B0_dwi.bvec
│   │   └── sub-controlGE140_ses-post_acq-B0_dwi.nii.gz
│   ├── fmap
│   │   ├── sub-controlGE140_ses-post_fieldmap.json
│   │   ├── sub-controlGE140_ses-post_fieldmap.nii.gz
│   │   └── sub-controlGE140_ses-post_magnitude.nii.gz
│   └── func
│       ├── sub-controlGE140_ses-post_task-flanker_bold.nii.gz
│       ├── sub-controlGE140_ses-post_task-flanker_events.tsv
│       └── sub-controlGE140_ses-post_task-rest_bold.nii.gz
└── ses-pre
    ├── anat
    │   ├── sub-controlGE140_ses-pre_acq-1_T2w.nii.gz
    │   ├── sub-controlGE140_ses-pre_acq-2_T2w.nii.gz
    │   └── sub-controlGE140_ses-pre_T1w.nii.gz
    ├── cbf
    │   ├── sub-controlGE140_ses-pre_acq-ASL_cbf.nii.gz
    │   └── sub-controlGE140_ses-pre_acq-cbf_cbf.nii.gz
    ├── dwi
    │   ├── sub-controlGE140_ses-pre_acq-60D_dwi.bval
    │   ├── sub-controlGE140_ses-pre_acq-60D_dwi.bvec
    │   ├── sub-controlGE140_ses-pre_acq-60D_dwi.bvece
    │   ├── sub-controlGE140_ses-pre_acq-60D_dwi.nii.gz
    │   ├── sub-controlGE140_ses-pre_acq-B0_dwi.bval
    │   ├── sub-controlGE140_ses-pre_acq-B0_dwi.bvec
    │   └── sub-controlGE140_ses-pre_acq-B0_dwi.nii.gz
    ├── fmap
    │   ├── sub-controlGE140_ses-pre_fieldmap.json
    │   ├── sub-controlGE140_ses-pre_fieldmap.nii.gz
    │   └── sub-controlGE140_ses-pre_magnitude.nii.gz
    └── func
        ├── outpt.txt
        ├── sub-controlGE140_ses-pre_task-flanker_bold.nii.gz
        ├── sub-controlGE140_ses-pre_task-flanker_events.tsv
        └── sub-controlGE140_ses-pre_task-rest_bold.nii.gz

12 directories, 36 files

There are a couple of top-level JSON files in the BIDS directory too:
fieldmap.json
task-flanker_bold.json
task-rest_bold.json

POSSIBLY RELEVANT
I see that the session variable gets defined in bids_utils.py, but I can't readily parse how you pull the BIDS session information into f_dict.
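
For what it's worth, here is a rough, illustrative sketch of the kind of entity parsing (something like the f_dict mentioned above) that would recover both labels from the paths in this dataset; this is not the actual bids_utils.py implementation:

# Rough, illustrative sketch of parsing BIDS entities from a filename
# into a dict; not the actual bids_utils.py implementation.
def parse_entities(path):
    fname = path.split('/')[-1].split('.')[0]  # e.g. sub-controlGE140_ses-pre_T1w
    parts = fname.split('_')[:-1]              # drop the suffix (T1w, bold, ...)
    return dict(part.split('-', 1) for part in parts)

print(parse_entities('sub-controlGE140/ses-pre/anat/sub-controlGE140_ses-pre_T1w.nii.gz'))
# {'sub': 'controlGE140', 'ses': 'pre'}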

Docker image GUI on Mac

Hi!

I tried to run the GUI with the Docker image on Mac using the instructions on GitHub. XQuartz was already installed; I authorized connections from network clients, added the IP address to xhost, and finally tried to run the following command (changing the directories to mine):

docker run -i --rm \
    --privileged \
    -e DISPLAY=$ip:0 \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v /tmp:/scratch \
    -v /Users/filo/data/ds005:/bids_dataset \
    -v /Users/filo/outputs:/outputs \
    bids/cpac \
    /bids_dataset /outputs GUI

I got the following error:
No protocol specified
Namespace(analysis_level='gui', aws_input_creds=None, aws_output_creds=None, bids_dir='/bids_dataset', bids_validator_config=None, data_config_file=None, disable_file_logging=False, mem_gb=None, mem_mb=None, n_cpus='1', output_dir='/outputs', participant_label=None, participant_ndx=None, pipeline_file='/cpac_resources/default_pipeline.yaml', save_working_dir=False, skip_bids_validator=False)
Starting CPAC GUI
Unable to access the X Display, is $DISPLAY set properly?

The XQuartz display doesn't seem to like the IP address; I also tried using the $DISPLAY variable directly, but it produces the same error.

Thank you in advance for your help,
Chris.

The pipeline is not being run

At the moment the run script sets up the pipeline but does not run it. I'm sure that this is just work in progress, but I am leaving this issue here for the record.

CPAC runs, but no Reports

Is it normal not to get the HTML reports in the logs folder? I ran a participant and it finished without errors, with everything I needed in the output, but the report doesn't show anything.

Thanks

Ed

Can't save Data Config file with Singularity GUI

This is a different issue from the other one I posted.

Now I can easily run the CPAC GUI using Singularity without problems. However, when I try to save a data configuration for the BIDS-formatted dataset I have, it gives me the following error:

Saving data settings file:
/media/egarza/INP_MRI_Backup/projects/INP/addimex_tms/data/mri/test/data_settings_test4.yaml


Generating data configuration file..
Checking participants.tsv file for site information:
/media/egarza/INP_MRI_Backup/projects/INP/addimex_tms/data/mri/bids_sofia/participants.tsv
No site information found in the participants.tsv file.
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/GUI/interface/windows/dataconfig_window.py", line 349, in <lambda>
    self.Bind(wx.EVT_BUTTON, lambda event: self.save(event,'run'), id=ID_RUN_EXT)
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/GUI/interface/windows/dataconfig_window.py", line 604, in save
    if self.run(path) > 0:
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/GUI/interface/windows/dataconfig_window.py", line 388, in run
    CPAC.utils.build_data_config.run(config)
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/utils/build_data_config.py", line 1534, in run
    config_dir=settings_dct["outputSubjectListLocation"])
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/utils/build_data_config.py", line 871, in get_BIDS_data_dct
    sites_dct=sites_subs_dct)
  File "/usr/local/miniconda/lib/python2.7/site-packages/CPAC/utils/build_data_config.py", line 1377, in get_nonBIDS_data
    raise Exception(err)
Exception: 

[!] No anatomical input file paths found given the data settings provided.

Anatomical file template being used: /media/egarza/INP_MRI_Backup/projects/INP/addimex_tms/data/mri/bids_sofia/*/sub-*/ses-*/anat/sub-*_ses-*_T1w.nii.gz


(run.py:13131): Gtk-WARNING **: Unable to find default local file monitor type

We do have a participants.tsv. It seems the anatomical path is looking for a folder before '/sub-*/' that should not exist.

Thanks

Ed

Tests time out

CircleCI has a limit of 2 hours per job. This seems to be too short to perform a full analysis of one subject. Would it be possible to limit the analysis, for testing purposes, to just the first couple of steps? This is what I did for the FreeSurfer and HCPPipelines apps.

Docker compilation issues

I haven't been able to successfully build Docker containers of the recent updates; in fact, not since the update 6-7 months ago that eliminated the dependence on install_cpac.sh for building containers.

The most recent v1.1.0_9 errors out due to a failure to install numpy 1.11's BLAS dependencies.
See the errors here.

Any suggestions on how to fix this?

trusty version not showing mount correctly

Setup:

Singularity container on CENTOS 7 server as the entry to a HPC cluster.

Behavior:

singularity shell jdkent_cpac_latest-2018-01-24-be00acf97a2f.img 
Singularity: Invoking an interactive shell within container...

Singularity.jdkent_cpac_latest-2018-01-24-be00acf97a2f.img> $ python
Python 2.7.13 |Continuum Analytics, Inc.| (default, Dec 20 2016, 23:09:15) 
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
>>> import nipype
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/__init__.py", line 49, in <module>
    from .pipeline import Node, MapNode, JoinNode, Workflow
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/__init__.py", line 10, in <module>
    from .engine import Node, MapNode, JoinNode, Workflow
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/__init__.py", line 12, in <module>
    from .workflows import Workflow
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/pipeline/engine/workflows.py", line 41, in <module>
    from ...interfaces.base import (traits, InputMultiPath, CommandLine,
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/__init__.py", line 12, in <module>
    from .io import DataGrabber, DataSink, SelectFiles
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/interfaces/io.py", line 38, in <module>
    from ..utils.filemanip import copyfile, list_to_filename, filename_to_list
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/utils/filemanip.py", line 266, in <module>
    _cifs_table = _generate_cifs_table()
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/utils/filemanip.py", line 259, in _generate_cifs_table
    reverse=True)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/nipype/utils/filemanip.py", line 258, in <lambda>
    key=lambda x: len(x[0]),
IndexError: list index out of range

The error is coming from the output of the mount command.

Example

Singularity.jdkent_cpac_latest-2018-01-24-be00acf97a2f.img> $ mount
singularity on / type rootfs (rw)

mount: warning: /etc/mtab is not writable (e.g. read-only filesystem).
       It's possible that information reported by mount(8) is not
       up to date. For actual information about system mount points
       check the /proc/mounts file.

Nipype expects the output to be just the line 'singularity on / type rootfs (rw)'; see the nipype code.
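
A simplified reconstruction of the failure (not nipype's exact code): splitting each line of that mount output and sorting by the first field breaks on the blank line, which splits to an empty list.

# Simplified, illustrative reconstruction of the IndexError; not nipype's exact code.
output = """singularity on / type rootfs (rw)

mount: warning: /etc/mtab is not writable (e.g. read-only filesystem)."""

entries = [line.split() for line in output.splitlines()]
# The blank line yields [], so x[0] raises IndexError inside the sort key,
# matching the crash at filemanip.py line 258.
sorted(entries, key=lambda x: len(x[0]), reverse=True)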

Parallel runs try to write to the same file

When running multiple C-PAC instances (one for each participant) on the same host, they try to write to the same log file concurrently:

Traceback (most recent call last):
  File "/code/run.py", line 289, in <module>
    plugin='MultiProc', plugin_args=plugin_args)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/CPAC/pipeline/cpac_runner.py", line 465, in run
    create_group_log_template(sub_scan_map, c.logDirectory)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/CPAC/utils/utils.py", line 1866, in create_group_log_template
    os.makedirs(reportdir)
  File "/usr/local/bin/miniconda/lib/python2.7/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 17] File exists: '/output/data/log/reports'

A possible solution would be to use a UUID or the participant label to write the log to a unique location for each run.
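
A minimal sketch of that idea (a hypothetical helper; the names are illustrative, not CPAC's actual API):

# Hypothetical helper: give each run its own reports directory and tolerate
# a parallel run creating it first. Names are illustrative, not CPAC's API.
import os
import uuid

def make_report_dir(log_dir, participant_label=None):
    suffix = participant_label or uuid.uuid4().hex[:8]
    reportdir = os.path.join(log_dir, 'reports_%s' % suffix)
    try:
        os.makedirs(reportdir)
    except OSError:
        if not os.path.isdir(reportdir):  # a real failure, not a benign race
            raise
    return reportdir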

'Configuration' object has no attribute 'disable_log'

I have a simple configuration, Pipeline_config.txt, to run CPAC, and I am getting a 'Configuration' object has no attribute 'disable_log' error. I have tried different configuration options but have not been able to find a fix. Can you please help?

Here is the log:
180130-20:12:37,959 workflow INFO:
VERSION: CPAC 1.0.2

Setting maximum number of cores per participant to 1
Setting number of participants at once to 1
Setting OMP_NUM_THREADS to 1
Setting MKL_NUM_THREADS to 1
Setting ANTS/ITK thread usage to 1

Maximum potential number of cores that might be used during this run: 1

++ 3dcalc: AFNI version=AFNI_16.3.08 (Nov 4 2016) [64-bit]
++ Authored by: A cast of thousands
Process Process-7:
Traceback (most recent call last):
  File "/usr/local/bin/miniconda/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/bin/miniconda/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/bin/miniconda/lib/python2.7/site-packages/CPAC/pipeline/cpac_pipeline.py", line 291, in prep_workflow
    if c.disable_log and c.disable_log == True:
AttributeError: 'Configuration' object has no attribute 'disable_log'
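
For context, a sketch of a defensive fix for the failing check at cpac_pipeline.py line 291 (not the project's actual patch) would treat a missing disable_log attribute as False instead of crashing:

# Sketch, not the project's actual patch: tolerate Configuration objects
# that were built from a config file without a disable_log key.
if getattr(c, 'disable_log', False):
    pass  # logging disabled; skip log setup here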

Derivatives in BIDS format?

Are there any plans in the works to output derivatives, such as regional timeseries or the nuisance regressors used, according to BIDS formatting?

cpac_install.sh installation failure

root@ubuntu:~# sudo ./cpac_install.sh -r
Installing the C-PAC ecosystem system-wide on UBUNTU with -r
Installing C-PAC system dependencies... [cmake git graphviz graphviz-dev gsl-bin libexpat1-dev libgiftiio-dev libglib2.0-dev libglu1-mesa-dev libjpeg-progs libxml2-dev libxext-dev libxft-dev libxi-dev libxmu-headers libxmu-dev libxpm-dev libxslt1-dev mesa-common-dev mesa-utils netpbm build-essential xvfb libgl1-mesa-dri tcsh zlib1g-dev m4 libmotif-dev libxp-dev libgsl0-dev][30]
Ign cdrom://Ubuntu 14.04.5 LTS Trusty Tahr - Release amd64 (20160803) trusty InRelease
Ign cdrom://Ubuntu 14.04.5 LTS Trusty Tahr - Release amd64 (20160803) trusty/main Translation-en_US
Ign cdrom://Ubuntu 14.04.5 LTS Trusty Tahr - Release amd64 (20160803) trusty/main Translation-en
Ign cdrom://Ubuntu 14.04.5 LTS Trusty Tahr - Release amd64 (20160803) trusty/restricted Translation-en_US
Ign cdrom://Ubuntu 14.04.5 LTS Trusty Tahr - Release amd64 (20160803) trusty/restricted Translation-en
Ign http://archive.ubuntu.com trusty InRelease
Get:1 http://archive.ubuntu.com trusty-updates InRelease [65.9 kB]
Hit http://archive.ubuntu.com trusty Release.gpg
Get:2 http://archive.ubuntu.com trusty-updates/main amd64 Packages [1,119 kB]
Get:3 http://security.ubuntu.com trusty-security InRelease [65.9 kB]
Get:4 http://security.ubuntu.com trusty-security/main amd64 Packages [783 kB]
Get:5 http://archive.ubuntu.com trusty-updates/restricted amd64 Packages [17.2 kB]
Get:6 http://archive.ubuntu.com trusty-updates/main Translation-en [555 kB]
Get:7 http://archive.ubuntu.com trusty-updates/restricted Translation-en [4,021 B]
Hit http://archive.ubuntu.com trusty Release
Hit http://archive.ubuntu.com trusty/main amd64 Packages
Hit http://archive.ubuntu.com trusty/restricted amd64 Packages
Hit http://archive.ubuntu.com trusty/main Translation-en
Hit http://archive.ubuntu.com trusty/restricted Translation-en
Ign http://archive.ubuntu.com trusty/main Translation-en_US
Ign http://archive.ubuntu.com trusty/restricted Translation-en_US
Get:8 http://security.ubuntu.com trusty-security/restricted amd64 Packages [14.2 kB]
Get:9 http://security.ubuntu.com trusty-security/main Translation-en [420 kB]
Get:10 http://security.ubuntu.com trusty-security/restricted Translation-en [3,556 B]
Fetched 3,049 kB in 19s (153 kB/s)
Reading package lists... Done
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages will be upgraded:
wget
1 upgraded, 0 newly installed, 0 to remove and 444 not upgraded.
Need to get 270 kB of archives.
After this operation, 0 B of additional disk space will be used.
Get:1 http://security.ubuntu.com/ubuntu/ trusty-security/main wget amd64 1.15-1ubuntu1.14.04.4 [270 kB]
Fetched 270 kB in 1s (174 kB/s)
(Reading database ... 172284 files and directories currently installed.)
Preparing to unpack .../wget_1.15-1ubuntu1.14.04.4_amd64.deb ...
Unpacking wget (1.15-1ubuntu1.14.04.4) over (1.15-1ubuntu1.14.04.2) ...
Processing triggers for install-info (5.2.0.dfsg.1-2) ...
Processing triggers for man-db (2.6.7.1-1ubuntu1) ...
Setting up wget (1.15-1ubuntu1.14.04.4) ...
Reading package lists... Done
Building dependency tree
Reading state information... Done
Package mesa-utils is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source

E: Unable to locate package gsl-bin
E: Unable to locate package libgiftiio-dev
E: Unable to locate package libjpeg-progs
E: Package 'mesa-utils' has no installation candidate
E: Unable to locate package tcsh
E: Unable to locate package libmotif-dev
libxp is installed via apt for Ubuntu 14.04
[ Fri Nov 9 08:38:10 UTC 2018 ] apt-get failed to install packages: cmake git graphviz graphviz-dev gsl-bin libexpat1-dev libgiftiio-dev libglib2.0-dev libglu1-mesa-dev libjpeg-progs libxml2-dev libxext-dev libxft-dev libxi-dev libxmu-headers libxmu-dev libxpm-dev libxslt1-dev mesa-common-dev mesa-utils netpbm build-essential xvfb libgl1-mesa-dri tcsh zlib1g-dev m4 libmotif-dev libxp-dev libgsl0-dev
Reading package lists... Done
Building dependency tree
Reading state information... Done
0 upgraded, 0 newly installed, 0 to remove and 444 not upgraded.
[ Fri Nov 9 08:38:10 UTC 2018 ] : C-PAC system dependencies not fully installed.
Python dependencies cannot be installed unless system-level dependencies are installed first.
Have your system administrator install system-level dependencies as root.
Exiting now...

Workflow graphs

Hi,

Is it possible to run the preprocessing pipeline without having to create workflow graphs? I constantly get an error when generating the workflow graphs using nipype.

Thank you!

Get slice timings

Perhaps it's a silly question, but the manual says BIDS needs the precise slice acquisition times to perform slice timing correction. What tool can I use to get the times? Dcm2Bids doesn't extract them, and I can't get them from the NIfTI header, so I guess they should be in the DICOM header? Or how do you guys do it?

Thanks

Ed
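
For what it's worth, converters like dcm2niix typically write a SliceTiming field into the BIDS JSON sidecar when the DICOMs carry that information; a minimal sketch of reading it (the path is illustrative):

# Minimal sketch (illustrative path): read per-slice acquisition times
# from the BIDS JSON sidecar written by a converter such as dcm2niix.
import json

with open('sub-01/func/sub-01_task-rest_bold.json') as f:
    sidecar = json.load(f)

print(sidecar.get('SliceTiming'))  # list of onsets in seconds, or None if absent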

CPAC crashes before completion

Greetings,
I have been running the BIDS-App C-PAC Docker container image (v1.0.2_disable_log_2) on test participants from two different datasets. CPAC crashes out prior to completion.

The crash error in the pypeline.log file reads:


180330-02:24:47,161 workflow ERROR:
could not run node: resting_preproc_sub-166_ses-1.gen_motion_stats_0.calc_spike_percent.a0.c0
180330-02:24:47,162 workflow INFO:
crashfile: /outputs/crash/crash-20180330-002347-root-calc_spike_percent.a0.c0-8b8c1d4c-4aef-4ba6-a40a-a4b7996d54f7.pklz
180330-02:24:47,163 workflow INFO:


I'm uploading the full pypeline.log file and can send the crash log .pklz file if needed.

Thanks
pypeline.log
