bids-standard / bids-specification

Brain Imaging Data Structure (BIDS) Specification

Home Page: https://bids-specification.readthedocs.io/

License: Creative Commons Attribution 4.0 International

Shell 0.87% CSS 0.25% TeX 0.49% Python 89.75% Jupyter Notebook 3.53% TypeScript 4.32% Makefile 0.51% Batchfile 0.28%
bids neuroimaging data-standards standards

bids-specification's People

Contributors

adam2392, bendhouseart, bids-maintenance, choldgraf, chrisgorgo, dependabot[bot], dimitripapadopoulos, effigies, eort, francopestilli, franklin-feingold, greydongilmore, hoechenberger, lestropie, mariehbourget, melanieganz, nellh, nicholst, oesteban, pre-commit-ci[bot], remi-gau, rob-luke, robertoostenveld, rwblair, sappelhoff, teonbrooks, thechymera, tsalo, wouterpotters, yarikoptic


bids-specification's Issues

New milestone: BIDS Extension Proposal 005: ASL

Dear all,

Together with Michael Chappell, Thom Okell, Jan Petr, David Thomas, John Detre, Patricia Clement, Danny Wang, Catherine Morgan, Andrew Robertson, and Chris Gorgolewski,

we've been working offline on a first draft; I've now put this into this public draft. All comments are welcome! Thanks everyone for contributing. Also, David Thomas kindly provided three example datasets from one subject rescanned on the current GE, Philips, and Siemens PCASL product sequences.

Best wishes, Henk

make CONTRIBUTING more visible

GitHub shows the formatted README contents to all visitors of the repository. It is arguably the first place to look when you get here - and currently it looks a bit bleak:
[screenshot of the current README]

The most important information for finding one's way around this repository is actually in https://github.com/bids-standard/bids-specification/blob/master/CONTRIBUTING.md

I think we should:

  • link from the README to the CONTRIBUTING
  • also mention the readthedocs html link in the README (currently it's only displayed in the header above)

Inheritance principle: clarify the procedure of which files would be considered

It might be just me, but from the current wording it is not 100% clear whether only _run and _rec get special treatment of being "generalized over", or whether any other possible key/value pair (_acq, etc.) does too. I.e., for a file

sub-<label>/[ses-<label>/]
    func/
        sub-<label>[_ses-<label>]_task-<label>[_acq-<label>][_ce-<label>][_dir-<label>][_rec-<label>][_run-<index>][_echo-<index>]_<contrast_label>.json

would task-<label>_<contrast_label>.json at the top level be considered, disregarding any OPTIONAL keys possibly present (like _acq, _ce, etc.)?

What about anatomy, e.g. for a file

sub-<label>/[ses-<label>/]
    anat/
        sub-<label>[_ses-<label>][_acq-<label>][_ce-<label>][_rec-<label>][_run-<index>]_<suffix>.json

would <suffix>.json be considered regardless of any acq, ce, etc. possibly present in the target subject/session-specific file?

To make it totally clear, it might be worth formulating explicitly a generic rule for generating the "higher level" filename considered for inheritance for any given "leaf" file (this might already be what pybids implements -- I haven't checked yet); a minimal sketch in Python follows the list below:

  • strip sub-<label>_ prefix while providing generalization across subject(s)
  • remove _ses-<label> while providing generalization across session(s)
  • remove any _key-<value> if generalizing across various types of acquisition
  • strip a possibly remaining leading _ if only the suffix is left, such as _<suffix>.json in the anat example
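To make the rule concrete, here is a minimal Python sketch of the generalization step (an illustration of the bullet points above, not an excerpt from pybids; the function name and the default set of stripped entities are hypothetical):

def generalized_name(filename, drop=("sub", "ses", "acq", "ce", "rec", "run", "echo", "dir")):
    """Hypothetical sketch: build the "higher level" sidecar name that a leaf
    file could inherit from, by stripping the given key-value entities.

    >>> generalized_name("sub-01_ses-02_task-rest_acq-X_run-1_bold.json",
    ...                  drop=("sub", "ses", "acq", "run"))
    'task-rest_bold.json'
    >>> generalized_name("sub-01_acq-mprage_T1w.json", drop=("sub", "acq"))
    'T1w.json'
    """
    stem, dot, ext = filename.partition(".")
    parts = stem.split("_")
    suffix = parts.pop()                         # last chunk is the suffix, e.g. "bold"
    kept = [p for p in parts if p.split("-")[0] not in drop]
    return "_".join(kept + [suffix]) + dot + ext

Note how stripping everything but the suffix naturally yields T1w.json with no leading underscore, matching the last bullet.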

What do you think? Or is it just me, and the current wording is sufficient (I would be OK with that)?

FWIW, here is a list of all interesting "corner cases" across OpenNeuro datasets
$> ls -l */*json | grep -v -e dataset_de -e participa -e task-  
-rw------- 1 yoh yoh   240 Dec  4 15:36 ds000101/T1w.json
-rw------- 1 yoh yoh   247 Dec  4 15:37 ds000102/T1w.json
-rw------- 1 yoh yoh  1372 Dec  4 16:08 ds000117/acq-mprage_T1w.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-1_FLASH.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-2_FLASH.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-3_FLASH.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-4_FLASH.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-5_FLASH.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-6_FLASH.json
-rw------- 1 yoh yoh    82 Dec  4 16:08 ds000117/run-1_echo-7_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-1_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-2_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-3_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-4_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-5_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-6_FLASH.json
-rw------- 1 yoh yoh    83 Dec  4 16:08 ds000117/run-2_echo-7_FLASH.json
-rw------- 1 yoh yoh   146 Dec  4 15:46 ds000144/T1w.json
-rw------- 1 yoh yoh   128 Dec  4 15:47 ds000164/T1w.json
-rw------- 1 yoh yoh   204 Dec  4 15:46 ds000168/T1w.json
-rw------- 1 yoh yoh   288 Dec  4 15:42 ds000174/T1w.json
-rw------- 1 yoh yoh  1184 Dec  4 15:58 ds000201/T1w.json
-rw------- 1 yoh yoh  1127 Dec  4 15:58 ds000201/T2w.json
-rw------- 1 yoh yoh   901 Dec  4 15:59 ds000201/dwi.json
-rw------- 1 yoh yoh   209 Dec  4 15:47 ds000205/T1w.json
-rw------- 1 yoh yoh   207 Dec  4 15:41 ds000208/T1w.json
-rw------- 1 yoh yoh   141 Dec  4 15:49 ds000213/T1w.json
-rw------- 1 yoh yoh   169 Dec  4 15:48 ds000214/T1w.json
-rw------- 1 yoh yoh   226 Dec  4 15:48 ds000222/T1w.json
-rw------- 1 yoh yoh    62 Dec  4 15:48 ds000229/T1w.json
-rw------- 1 yoh yoh   285 Dec  4 15:44 ds000231/T1w.json
-rw------- 1 yoh yoh   408 Dec  4 15:49 ds000239/T1w.json
-rw------- 1 yoh yoh   217 Dec  4 15:44 ds000240/T1w.json
-rw------- 1 yoh yoh    73 Dec  4 16:05 ds000244/dir-0_epi.json
-rw------- 1 yoh yoh    74 Dec  4 16:05 ds000244/dir-1_epi.json
-rw------- 1 yoh yoh   197 Dec  4 16:05 ds000244/dwi.json
-rw------- 1 yoh yoh   252 Dec  4 15:47 ds000245/T1w.json
-rw------- 1 yoh yoh   195 Dec  4 15:42 ds000248/acq-epi_T1w.json
-rw------- 1 yoh yoh    77 Dec  4 15:42 ds000248/acq-flipangle05_run-01_MEFLASH.json
-rw------- 1 yoh yoh    78 Dec  4 15:42 ds000248/acq-flipangle30_run-01_MEFLASH.json
-rw------- 1 yoh yoh   144 Dec  4 15:48 ds000254/T1w.json
-rw------- 1 yoh yoh   176 Dec  4 15:43 ds000255/T1w.json
-rw------- 1 yoh yoh   215 Dec  4 15:55 ds001021/T1w.json
-rw------- 1 yoh yoh   157 Dec  4 15:55 ds001021/dwi.json
-rw------- 1 yoh yoh    54 Dec  4 15:55 ds001021/phasediff.json
-rw------- 1 yoh yoh    69 Dec  4 15:39 ds001105/dir-AP_epi.json
-rw------- 1 yoh yoh    68 Dec  4 15:39 ds001105/dir-PA_epi.json
-rw------- 1 yoh yoh   259 Dec  4 16:01 ds001246/T1w.json
-rw------- 1 yoh yoh   253 Dec  4 16:01 ds001246/inplaneT2.json
-rw------- 1 yoh yoh   977 Dec  4 15:52 ds001386/bold.json
-rw------- 1 yoh yoh    75 Dec  4 15:57 ds001454/phasediff.json
-rw------- 1 yoh yoh   376 Dec  4 16:04 ds001486/T1w.json
-rw------- 1 yoh yoh   247 Dec  4 16:07 ds001525/T1w.json
-rw------- 1 yoh yoh    75 Dec  4 16:09 ds001545/phasediff.json
-rw------- 1 yoh yoh   517 Dec  4 16:16 ds001597/T1w.json
-rw------- 1 yoh yoh   517 Dec  4 16:16 ds001597/T2w.json
A list of all interesting task- "corner cases" across OpenNeuro datasets, where it is not just task-<label>_bold.json
$> ls -l */*json | grep -v -e dataset_de -e participa | grep task-.*_[^b]
-rw------- 1 yoh yoh   284 Dec  4 15:47 ds000164/task-stroop_events.json
-rw------- 1 yoh yoh    76 Dec  4 15:59 ds000201/task-hands_physio.json
-rw------- 1 yoh yoh   596 Dec  4 15:48 ds000214/task-Cyberball_events.json
-rw------- 1 yoh yoh   884 Dec  4 15:46 ds000234/task-motorphotic_asl.json
-rw------- 1 yoh yoh   786 Dec  4 15:44 ds000235/task-rest_asl.json
-rw------- 1 yoh yoh   786 Dec  4 15:42 ds000236/task-rest_asl.json
-rw------- 1 yoh yoh   224 Dec  4 15:47 ds000237/task-MemorySpan_acq-multiband_bold.json
-rw------- 1 yoh yoh   934 Dec  4 15:44 ds000240/task-restEyesOpen_asl.json
-rw------- 1 yoh yoh  1628 Dec  4 16:05 ds000244/task-ArchiEmotional_acq-ap_bold.json
-rw------- 1 yoh yoh  1628 Dec  4 16:05 ds000244/task-ArchiEmotional_acq-ap_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ArchiEmotional_acq-pa_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ArchiEmotional_acq-pa_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-ArchiSocial_acq-ap_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-ArchiSocial_acq-ap_sbref.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-ArchiSocial_acq-pa_bold.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-ArchiSocial_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ArchiSpatial_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ArchiSpatial_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-ArchiSpatial_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-ArchiSpatial_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ArchiStandard_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ArchiStandard_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ArchiStandard_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ArchiStandard_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn01_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn01_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn02_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn02_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn03_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn03_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn04_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn04_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn05_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn05_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn06_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn06_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn07_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn07_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn08_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn08_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn09_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn09_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn10_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn10_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn11_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsTrn11_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn12_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsTrn12_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsVal01_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsVal01_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal02_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal02_acq-ap_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal03_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal03_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsVal04_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsVal04_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal05_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal05_acq-ap_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal06_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal06_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsVal07_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-ClipsVal07_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal08_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal08_acq-ap_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal09_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-ClipsVal09_acq-ap_sbref.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-ContRing_acq-ap_bold.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-ContRing_acq-ap_sbref.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-ExpRing_acq-pa_bold.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-ExpRing_acq-pa_sbref.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-HcpEmotion_acq-ap_bold.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-HcpEmotion_acq-ap_sbref.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-HcpEmotion_acq-pa_bold.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-HcpEmotion_acq-pa_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-HcpGambling_acq-ap_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-HcpGambling_acq-ap_sbref.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-HcpGambling_acq-pa_bold.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-HcpGambling_acq-pa_sbref.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-HcpLanguage_acq-ap_bold.json
-rw------- 1 yoh yoh  1624 Dec  4 16:05 ds000244/task-HcpLanguage_acq-ap_sbref.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-HcpLanguage_acq-pa_bold.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-HcpLanguage_acq-pa_sbref.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-HcpMotor_acq-ap_bold.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-HcpMotor_acq-ap_sbref.json
-rw------- 1 yoh yoh  1621 Dec  4 16:05 ds000244/task-HcpMotor_acq-pa_bold.json
-rw------- 1 yoh yoh  1621 Dec  4 16:05 ds000244/task-HcpMotor_acq-pa_sbref.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-HcpRelational_acq-ap_bold.json
-rw------- 1 yoh yoh  1627 Dec  4 16:05 ds000244/task-HcpRelational_acq-ap_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-HcpRelational_acq-pa_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-HcpRelational_acq-pa_sbref.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-HcpSocial_acq-ap_bold.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-HcpSocial_acq-ap_sbref.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-HcpSocial_acq-pa_bold.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-HcpSocial_acq-pa_sbref.json
-rw------- 1 yoh yoh  1619 Dec  4 16:05 ds000244/task-HcpWm_acq-ap_bold.json
-rw------- 1 yoh yoh  1619 Dec  4 16:05 ds000244/task-HcpWm_acq-ap_sbref.json
-rw------- 1 yoh yoh  1618 Dec  4 16:05 ds000244/task-HcpWm_acq-pa_bold.json
-rw------- 1 yoh yoh  1618 Dec  4 16:05 ds000244/task-HcpWm_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage00_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage00_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage00_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage00_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage01_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage01_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage01_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage01_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage02_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage02_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage02_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage02_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage03_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage03_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage03_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage03_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage04_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage04_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage04_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage04_acq-pa_sbref.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage05_acq-ap_bold.json
-rw------- 1 yoh yoh  1626 Dec  4 16:05 ds000244/task-RSVPLanguage05_acq-ap_sbref.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage05_acq-pa_bold.json
-rw------- 1 yoh yoh  1625 Dec  4 16:05 ds000244/task-RSVPLanguage05_acq-pa_sbref.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-WedgeAnti_acq-ap_bold.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-WedgeAnti_acq-ap_sbref.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-WedgeAnti_acq-pa_bold.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-WedgeAnti_acq-pa_sbref.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-WedgeClock_acq-ap_bold.json
-rw------- 1 yoh yoh  1623 Dec  4 16:05 ds000244/task-WedgeClock_acq-ap_sbref.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-WedgeClock_acq-pa_bold.json
-rw------- 1 yoh yoh  1622 Dec  4 16:05 ds000244/task-WedgeClock_acq-pa_sbref.json
-rw------- 1 yoh yoh   843 Dec  4 15:48 ds000254/task-bilateralfingertapping_echo-1_bold.json
-rw------- 1 yoh yoh   842 Dec  4 15:48 ds000254/task-bilateralfingertapping_echo-2_bold.json
-rw------- 1 yoh yoh   843 Dec  4 15:48 ds000254/task-bilateralfingertapping_echo-3_bold.json
-rw------- 1 yoh yoh   843 Dec  4 15:48 ds000254/task-bilateralfingertapping_echo-4_bold.json
-rw------- 1 yoh yoh  2345 Dec  4 15:55 ds001021/task-BREATHHOLD_acq-1400_bold.json
-rw------- 1 yoh yoh  1916 Dec  4 15:55 ds001021/task-CHECKERBOARD_acq-1400_bold.json
-rw------- 1 yoh yoh  1605 Dec  4 15:55 ds001021/task-CHECKERBOARD_acq-645_bold.json
-rw------- 1 yoh yoh  1593 Dec  4 15:55 ds001021/task-rest_acq-1400_bold.json
-rw------- 1 yoh yoh  1314 Dec  4 15:55 ds001021/task-rest_acq-645_bold.json
-rw------- 1 yoh yoh  1134 Dec  4 15:55 ds001021/task-rest_acq-CAP_bold.json
-rw------- 1 yoh yoh    76 Dec  4 16:11 ds001553/task-checkerboard_events.json
-rw------- 1 yoh yoh   869 Dec  4 16:16 ds001597/task-cuedMFM_events.json
-rw------- 1 yoh yoh    25 Dec  4 16:18 ds001600/task-rest_acq-AP_bold.json
-rw------- 1 yoh yoh    25 Dec  4 16:18 ds001600/task-rest_acq-PA_bold.json
-rw------- 1 yoh yoh    25 Dec  4 16:18 ds001600/task-rest_acq-v1_bold.json
-rw------- 1 yoh yoh    25 Dec  4 16:18 ds001600/task-rest_acq-v2_bold.json
-rw------- 1 yoh yoh    25 Dec  4 16:18 ds001600/task-rest_acq-v4_bold.json

edit 1: "The constraint"
Another aspect that is not entirely clear to me, because it is not spelled out, is the requirement to have only a single "applicable" file at any given level, which is demonstrated in "Example 1: Two JSON files at same level that are applicable for NIfTI file":

"violating the constraint that no more than one file may be defined at a given level of the directory structure"
(the wording around this will soon be tuned up a bit in https://github.com/bids-standard/bids-specification/pull/98/files#diff-ba564f153b960d803d493fe37fbbb34eL148). I do not see a clear definition of that constraint in the actual text describing the inheritance principle. While working on fixing the automatic aggregation of common fields into top-level files to be inherited within heudiconv, I painted myself into a corner with an example of having, e.g.,

  • sub-1_task-task1_run-1_bold.json and
  • sub-1_task-task1_acq-X_run-1_bold.json

per subject (should be OK), and then trying to aggregate over them while also retaining _acq- if defined. Then I would end up with

  • task-task1_bold.json
  • task-task1_acq-X_bold.json

at the top level. Is this legitimate?
Should the _acq-X_bold leaf files then also inherit from task-task1_bold.json? I guess they shouldn't.
And it is not just a matter of having the constraint "no more than one file may be defined at a given level of the directory structure", because I could potentially place the _acq-X file in a per-subject directory, thus avoiding it. It is a matter of a clear definition of how we "expand" the common filename to match any leaf one. If the rule is that we may expand into a leaf file with arbitrary additional _key-value pairs, then the situation above could be OK, so that task-task1_acq-X_bold.json could extend (or overwrite, but not delete) fields defined in task-task1_bold.json.

iEEG Spec - ongoing questions

There were several questions about

  1. how to store data recorded during electrical stimulation
  2. how to store iEEG data during ongoing natural behavior (no task, no instruction, not even rest, just natural behavior, which can include many things).

Some options:
a) As with the MEG empty-room recording, we could have a section here with an example.
b) We add a FAQ section to the Starters Kit, and have a suggestion there.

Questions about BIDS Labeling

We are working on setting up BIDS formatting at UCLA. The goal is to take the data directly from the scanner servers, move it either to a personal computer or our cluster, and end up with the data in BIDS format with little effort on the user's part. We have been taking our old script and modifying it to follow the BIDS format. While doing that, we ran into some questions that we did not know the answers to and thought we should ask.

  1. Where should the scout files go, and how should they be named?

  2. This appears to have been brought up in another post, but it is not clear what the conclusion was. Our MPRAGE images come as both NORMalized (intensity normalization performed by the scanner software) and original raw T1w scans. There seemed to be a suggestion to have _proc identify this. What would the labels be for the normalized and non-normalized images, and would both or only one need that label?

  3. When we have a study with multiple runs, we tend to name the BOLD scan on the scanner either with the same name for every run (e.g. BOLD_AP and BOLD_AP) or with a different name for each run (e.g. BOLD_AP_Block01 and BOLD_AP_Block02). This might cause an issue when the code has to determine the run numbers itself under different naming conventions. The same issue applies to the task name (e.g. figuring out whether it should be task-bold, task-rest, task-loc, etc.). We would like our code to be smart enough to automatically take the name of the run from the scanner and label it correctly in BIDS format. I assume this would require everyone to change the name of their scan on the scanner for the BIDS script to work properly (unless there is a better way). Is there any agreement on what the run names on the scanner should be?

  4. Related to the previous issue: what happens if a run was rerun and the first run's data should be discarded? This could be because the trigger didn't catch, the code had to be restarted, or whatever. In an ideal world, the BIDS code could tell that the same scan was run twice, name the second scan normally, and give the first scan some sort of identifier saying it is a throwaway scan. I assume this would require us to name each scan with a unique name (e.g. BOLD_AP_Block01 and BOLD_AP_Block02) so it can tell that the same scan was run twice. Has anyone else solved this issue?

  5. This is more of a minor point, but I know there is a push to no longer use the term "subject" and instead use "participant". Unfortunately, it seems the BIDS format uses sub everywhere. Do we have to use that naming convention? Is there another term we can use? Do tools like fMRIPrep require the sub prefix to be used?

BEP: make `dir` optional for other than fmap modalities

ATM, "Phase encoding direction" (dir) is a required field for fmap(epi). But it is relevant for other sequences (fmap, dwi) but is not available in BIDS specification for them. To overcome this specification people resort to list the value in a flexible and omni-present acq field:

$> datalad -c datalad.search.index-egrep-documenttype=all search path:.*_acq-ap.*
search(ok): /home/yoh/datalad/openfmri/ds000221/sub-010001/ses-02/func/sub-010001_ses-02_task-rest_acq-AP_run-01_bold.json (file)
...
search(ok): /home/yoh/datalad/openfmri/ds000244/sub-01/ses-00/func/sub-01_ses-00_task-ArchiSocial_acq-ap_bold.nii.gz (file)
...

So it seems the demand is there, and I do not see a reason why dir should not be allowed in those cases.

add CONTRIBUTING.md with links/specs of style guide

I realized as I was reading through the raw specification file that I was hesitant to suggest style edits because I didn't know whether any decisions had been made about which flavor of Markdown we are using, so I didn't know how to suggest changes to things like tables and file paths/directory trees.

A contributing guide with a record of Markdown formatting decisions would be useful.

Specifically, the things I wondered about right away were:

  • are all the tables still in HTML because of a formatting decision, or can they be converted to Markdown tables?
  • is there a decided formatting structure for directory trees? (should they look like tree outputs, or just be bulleted/indented?)

Adding CBV support

Hi @chrisfilo, thank you so much for moving development to Git. This has motivated me to finally get around to adding CBV contrast support, as we had discussed on the Google Docs page.

Would this be the appropriate page to edit for my addition? Is there a model for documenting the CBV field?

Precedence in JSON/NIfTI metadata mismatch

Over in bids-standard/pybids#357, we've been having a small discussion over the precedence of metadata when JSON sidecars and NIfTI disagree. Specifically the question was whether to take the 4th pixdim from the NIfTI or the RepetitionTime entry in the BIDS metadata, which only matters when there is disagreement.
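For concreteness, a minimal sketch of the kind of check involved, assuming nibabel, a 4D BOLD image whose header time units are seconds, and an arbitrary tolerance (not the validator's actual code):

import json
import nibabel as nib

def check_repetition_time(nii_path, json_path, tol=1e-3):
    """Compare the sidecar RepetitionTime against pixdim[4] in the NIfTI header."""
    with open(json_path) as f:
        tr_json = json.load(f)["RepetitionTime"]
    tr_header = float(nib.load(nii_path).header.get_zooms()[3])  # 4th zoom = time step
    if abs(tr_json - tr_header) > tol:
        print(f"Mismatch: sidecar says {tr_json} s, header says {tr_header} s")
    return tr_json, tr_header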

It is the case that, at least for RepetitionTime, a mismatch is treated as an error by the validator. However, in practice, not every dataset is run through the validator before being exposed to pybids or pushed through a BIDS App... In user support for fMRIPrep, we've seen enough issues that would be solved by running through the validator that we now bundle the validator and run it before starting fMRIPrep.

There are other metadata entries that do not produce errors on disagreement, such as PhaseEncodingDirection and SliceEncodingDirection (packed into the dim_info field) and SliceTiming (slice_code), which thus need to have a defined precedence (or begin producing errors).

fMRIPrep for one has made a policy of only querying JSON for pretty much everything besides voxel spatial dimensions, affines and data shape, and this seems like the correct approach for BIDS-aware applications. But it is also the case that we have a number of BIDS Apps that are just shims over existing non-BIDS-aware apps, which will thus only have NIfTI headers to rely upon, except where the shim writer also feeds in JSON metadata through another side channel.

I don't really have a satisfying proposal to make here, so this is mostly the start of a discussion. My inclination is that we should continue to prioritize JSON-encoded metadata, and increase the coverage of potential header/sidecar conflicts in the validator. In addition to the idea of respecting BIDS as a structure is the fact that JSON is easier to inspect and edit by hand, whereas modifying NIfTI headers requires more knowledge of some tool that can do the job. But making this a spec requirement may render some existing BIDS Apps non-compliant.

I would also suggest that a useful tool (possibly validator mode) would be to sync JSON metadata into NIfTI headers to resolve mismatches.
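A rough sketch of what such a sync step could look like for RepetitionTime alone (nibabel again; the direction of the sync, the file handling, and the neglect of time units are all assumptions, not a proposed validator feature):

import json
import nibabel as nib

def sync_tr_into_header(nii_path, json_path, out_path):
    """Overwrite the header time step with the sidecar RepetitionTime and save a copy."""
    with open(json_path) as f:
        tr = json.load(f)["RepetitionTime"]
    img = nib.load(nii_path)                                    # assumes a 4D image
    header = img.header.copy()
    header.set_zooms(header.get_zooms()[:3] + (tr,))            # keep spatial zooms, replace TR
    nib.save(img.__class__(img.dataobj, img.affine, header), out_path)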

This may be somewhat related to #102.

file formats for digitization points in BIDS-MEG specification

There are two issues regarding file formats in the BIDS-MEG specification:

  • Currently, BIDS-MEG only "recommends" certain file formats in Appendix VI. This does not necessarily restrict the file formats, in the sense that it is not a "must" or a "should". Should this be turned into a "must" to be consistent with the validator?
  • The headshape file formats are supposed to be in the "specific format of the 3-D digitizer’s manufacturer", but this information is completely missing from Appendix VI. What are the file formats that we should support here? One suggestion is to start from the list here

Relevant discussion thread: bids-standard/bids-validator#585

cc @chrisfilo @teonbrooks @sappelhoff @robertoostenveld @monkeyman192

Thoughts welcome!

Instructions for how to update the specification with a BEP

Is there a place with instructions for how to update the BIDS specification with a new BEP? I think we're ready to go soon with BIDS-iEEG, and wonder if we can use this as an opportunity to improve the docs around contributing to the markdown etc.

renaming CTF files

The specification says the following:

The renaming of CTF datasets SHOULD be done using the CTF newDs command-line application.

Is there something special about what the newDs tool does that cannot be done by other software? If not, I suggest tweaking the language here a bit:

The renaming of CTF datasets SHOULD be done with specialized software (e.g., the CTF newDs command-line application or MNE-BIDS).

cc @sappelhoff

Overlap in derivatives for tedana, raw for BEP001

We (@tsalo and I) were looking into creating BIDS derivatives for the tedana package. We've mostly succeeded in mapping our outputs on to existing BIDS derivatives, but there are two file types we're still unsure about.

One of these is a T2* map. Looking through the BEP001 draft, it seems that there is already a defined T2Starmap suffix. We were wondering if it would make sense to use the same suffix here? Would it be as simple as setting the Estimation Method field to Posse et al., 1999?

The other data type is what we call an S0 map, which doesn't seem to correspond to anything yet defined in BEP001. Is there a correspondence that I'm not seeing, or would this make sense as its own suffix?

Pinging in some BEP001 folks who might have comments: @KirstieJane @Gilles86 @agahkarakuzu @handwerkerd

maxfiltered data as raw or derivatives

There is conflicting information as to whether maxfiltered MEG data should be represented as raw (with proc-) or as derivatives.

The main specification now contains

After applying the MaxFilter pre-processing tool, files should be renamed with the corresponding label (e.g. proc-sss) and placed into a derivatives subfolder.

but also

SoftwareFilters | REQUIRED. List of temporal and/or spatial software filters applied, or ideally key:value pairs of pre-applied software filters and their parameter values: e.g., {"SSS": {"frame": "head", "badlimit": 7}} ...

where the latter relates to the meg.json of the raw data. This is in line with "8.4.1 MEG recording data", which gives the example that proc-<xxx>, with xxx = sss, tsss, trans, quat, mc, etc., can be used for a maxfiltered MEG dataset.

SimultaneousRecording and SimultaneousRecordingWith

Two fields:

Add to dataset_description.json a SimultaneousRecording field to indicate the different imaging modalities acquired at the same time, e.g. SimultaneousRecording: 'func', 'EEG', 'eye tracker', 'physio', 'behav'.

Add a SimultaneousRecordingWith field within the sidecar files to indicate the different imaging modalities acquired at the same time but on different hardware.

E.g., for ds000117: SimultaneousRecording: {'MEG','EEG'}, but no SimultaneousRecordingWith, because the MEG and EEG data are in the same file and were recorded together on the same hardware.

SimultaneousRecordingWith

Tibor's solution is to point to all files related to each other: for instance, in the run-02_bold.json we would have SimultaneousRecordingWith = {'eeg/sub-01_ses-01_task-something_run-02_eeg.vhdr', 'eyetracker/sub-01_ses-01_task-something_run-02_eyetracker.asc'}.

Chris G proposed to add the path relative to the dataset root (e.g. {'/sub-01/func/sub-01_task-something_run-02_bold.nii.gz'}) to accommodate hyperscanning (simultaneous recordings across participants).

Robert pointed out that 'It might also be relevant to know whether behaviour (behav) was recorded during the functional brain recordings (e.g. func+behav), or prior to (or after) the functional scan. It is often assumed (e.g. in the main bids spec) that these are recorded simultaneously, but e.g. section 8.7 and 8.8 already have to deal with the simultaneous versus sequential (or separate) measurement of the two. When considering SimultaneousRecording as a general field, it might have the side effect of "behav" becoming a mature data type on its own, rather than an addition to functional brain data. This would make it more symmetric'.

Mainak has an open issue: should we split concurrent recordings, e.g. split the MEG and EEG data of ds000117? Most felt it is neither necessary nor recommended, since the data come from the same hardware. One issue to solve is then to ensure the metadata cover all modalities properly, e.g. both MEG and EEG without conflict (this seems largely possible to me).

synchronization issue

Since we have multiple modalities, each one should have its own timing information recorded using events.tsv files. This is important because sampling frequencies differ between modalities, and clocks on different hardware often run at slightly different speeds. If there is no event (rest), we need at least one marker to ensure the recordings are synchronized (a typical case in point is starting the EEG recording before the MRI starts).

Chris G pointed out that, to compare events.tsv files, they should probably have the same number of rows, since there is no unique identifier for events.

A future format to be adopted, like XDF, contains multimodal data (say video screen capture, eye tracker, physio, and EEG) with a master clock and per-stream clocks -- again, splitting files seems redundant and unnecessary -- but under which folder would such a file appear? My idea would be that it is up to the experimenter to know/decide and to put it under the primary measure of the study.

BEP: Make proc filename keyword optional for other (T1, EPI, etc) modalities

Continuing my trend (#47) of demands, I would like to add _proc to the campaign.
The original use case (nipy/heudiconv#266) is NORMalized (intensity normalization performed by the scanner software) T1w and T2w anatomicals. The NORM'ed files accompany the non-normalized original raw data. Sure, I could just consider them "derivative" images and drop them, but some researchers might believe in the superiority of that preprocessing and demand that these files be present in their raw BIDS datasets, since that is what they get from the scanner.
Given that there is now a _proc entity (introduced as optional for stim and meg), I do not see why it should not be allowed (optional as well) for pretty much any other modality (in particular anat, bold, dwi).

Mapping from google doc version of specification to github directory structure

Hi - I think this is probably a good question for @franklin-feingold but anyone who knows the answer is welcome to reply!!

I'm looking to match up some of the sections in the released version of the BIDS specification (v1.1.1, https://bids.neuroimaging.io/bids_spec.pdf) and the locked Google Doc with the Read the Docs version this repository builds: https://bids-specification.readthedocs.io/en/stable.

Specifically, section 8 (Detailed file descriptions) of the Google Doc seems to have disappeared. I think it's now in 04-modality-specific-files?

I had a read of the src/CHANGES.md file but I think it's a bit difficult to see where the restructuring happened.

There's also the TOC.md file but I think that is just a copy from the old google doc? The links don't go anywhere. Might be a good reference for clarifying the mapping though!

Does anyone have a reference to which parts have been mapped to where?

If it doesn't exist then I can make a new issue to make it clear that it's a task that would welcome a contributor to help with 😄


Update: I think from reading src/CHANGES.md that there's actually a released 1.1.2 version....I'm not sure how to read that on readthedocs though 😬

Use dhall for BIDS specification and reference implementation

Recently a configuration language called Dhall was created to solve many of the problems with configuration files: https://github.com/dhall-lang.

Dhall provides a typed and modular alternative to json/yaml. It additionally provides the ability to include functions in the configuration (e.g. defaults for a field can be computed from other fields in the configuration).

I think this language provides a natural way to formalize the BIDS specification. The specification of BIDS is a type specification (which fields are optional, what kinds of values are allowed, etc.). This can be capitalized on: once the spec is a type schema, there is no need for an independent validator, since Dhall ensures the types are correct at every I/O step.

Additionally, JSON and Dhall are interconvertible (provided a type schema is supplied when going from JSON to Dhall), so BIDS datasets using JSON will be forward compatible.

I recognize that there are substantial risks associated with using relatively new and niche software for large projects. However, types provide safe-guards against many very common classes of bugs that could exist in the specification and validation of BIDS files, which I think justifies the change. Regardless of personal views on the utility of static typing in a general programming context, the advantages in a configuration/specification context should be clear.

In the near term I'm going to work on a subset of the BIDS spec in dhall for comparison and interest sake. Any thoughts, questions, or feedback would be appreciated!

tsv metadata in json files

So I am working on making my temporal network package teneto comply with the RC1 BIDS derivatives. Among other things, it creates connectivity matrices and time-varying connectivity matrices (I'm going to suggest the suffixes "_conn" and "_tconn" when we can make suggestions again).

I am thinking about the best way to store these so as to make them BIDS compatible, and I am having a slight problem regarding TSV metadata. The TSV files will look something like this:

i    j    value    [time] 
0    1    0.5      0
0    1    0.55     1
...

or (more likely)

i    j    value_at_t    value_at_t2 
0    1    0.5           0.55
0    2    0.4           0.41
...

However, the problem I am running into concerns the metadata for the TSV files.

At the moment, the JSON sidecars for TSV files are defined in the BIDS specification as follows:

Tabular files MAY be optionally accompanied by a simple data dictionary in a json format (see below). The data dictionaries MUST have the same name as their corresponding tabular files but with .json extensions. Each entry in the data dictionary has a name corresponding to a column name

So at the moment, from what I can tell, there is no way to put metadata for an entire TSV file in the JSON (unless I have missed where this is specified). The metadata must be per column. This is sub-optimal for my use case. If I want the JSON sidecar to include metadata about how the connectivity matrix was defined (e.g. whether Spearman or Pearson correlations were used to make it, whether it is a symmetric or directed connectivity matrix, whether it reflects functional or structural data, any additional transforms performed after the correlations were computed), there is not necessarily a good column to put this information in, since it describes the entire file.

A couple of suggestions to solve this:

  1. Allow a non-column entry in the JSON (e.g. "__global__") for JSON sidecars that accompany TSV files. For this solution, the BIDS specification would have to be modified to allow it.
  2. This type of data should not be saved as a tsv. Connectivity data should only be specified in cifti formats (I don't think this is the best solution personally, but I can see why some would suggest it).
  3. Place the global metadata in the columns most relevant to each property. E.g., in the voxel/node index columns i and j, the metadata could include whether the connectivity matrix is directed or not. This would not require any change to BIDS but, for usability, cramming metadata into slightly unexpected places seems suboptimal.

My vote is option 1, as that would also solve any similar problem that arises in the future without having to shoehorn global metadata into column data; a minimal sketch of what that could look like follows.
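To illustrate option 1, a small Python sketch of what such a sidecar could contain (the "__global__" key, the field names, and the filename are all hypothetical and not part of the specification):

import json

sidecar = {
    "__global__": {                 # hypothetical reserved key for file-level metadata
        "ConnectivityMeasure": "Pearson",
        "Directed": False,
    },
    # per-column entries, as currently allowed by the specification
    "i": {"Description": "Index of the first node/voxel of the edge"},
    "j": {"Description": "Index of the second node/voxel of the edge"},
    "value": {"Description": "Connectivity weight of the edge (i, j)"},
}

with open("sub-01_task-rest_conn.json", "w") as f:
    json.dump(sidecar, f, indent=2)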

naming multiple marker files for KIT MEG systems

Currently the specification only has details on how to name a single marker file associated with KIT MEG systems (see here).
MNE-Python, however, is able to handle multiple .mrk files and averages them if more than one is provided.
I wrote a PR to export the .mrk files with MNE-BIDS (mne-tools/mne-bids#114) but since we have no specification on what to call the files if multiple are used this PR cannot continue until this issue has been resolved.

While the majority of the time it seems only one .mrk file is ever used with any given .con file, sometimes researchers will acquire a pre- and a post-recording .mrk file. To this end I propose possibly having a -pre or -post tag after the _markers suffix in the file name.
Eg.

sub-control01_ses-001_task-rest_run-01_markers[-pre|-post].<mrk,sqd>

Not sure if this is the best way to do it, so I am very open to better suggestions.
@teonbrooks

mismatch between name of a *.nii.gz file and the *.nii file it contains

I posted it somewhere else but I figure this is also a potential BIDS issue.

AFAIK BIDS allows both the *.nii and *.nii.gz formats, but there is no requirement for the *.nii file inside to have the same name as the *.nii.gz file it is in. Moreover, several *.nii.gz files in a folder can contain *.nii files that, despite differing in content, have the same name.

For example the files sub-001_task-MGT_run-01_bold.nii.gz and sub-001_task-MGT_run-02_bold.nii.gz can both contain a file that is called vol0000_xform-00000.nii.

Maybe I am missing something about how unzipping works, but I think this could sometimes be an "inconvenience" for (mostly MATLAB/SPM) users whose first step is to unzip everything. So maybe it would be good to recommend in the main specs that those names match.
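For reference, the name "inside" a .nii.gz is the optional FNAME field of the gzip header (RFC 1952). A small self-contained sketch to inspect it (the function name is mine; FCOMMENT/FHCRC handling is not needed because those fields follow FNAME):

import struct

def gzip_stored_name(path):
    """Return the original filename recorded in a .gz header (RFC 1952 FNAME
    field), or None if the compressor did not store one."""
    with open(path, "rb") as f:
        fixed = f.read(10)                      # magic, method, flags, mtime, xfl, os
        if fixed[:2] != b"\x1f\x8b":
            raise ValueError(f"{path} is not a gzip file")
        flags = fixed[3]
        if flags & 0x04:                        # FEXTRA precedes FNAME: skip it
            (xlen,) = struct.unpack("<H", f.read(2))
            f.read(xlen)
        if not flags & 0x08:                    # FNAME flag not set
            return None
        name = bytearray()
        while (byte := f.read(1)) not in (b"", b"\x00"):
            name += byte
        return name.decode("latin-1")

E.g., gzip_stored_name("sub-001_task-MGT_run-01_bold.nii.gz") could return "vol0000_xform-00000.nii" regardless of the outer filename.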

Rename variables in document

  • steps: steps, components, stages, chunk
  • level: type, group, identifier, leveLevel
  • model: models, estimation
  • transformations: transforms, operations, ops, xforms, xfm
  • variables: predictors, X, IV, ivars, X_cols, X_names, dataframe, features, variable_names
  • outcome: y, DV, response_variables, labels
  • variance_structure: var_comps, rfx,
  • link_function
  • error_distribution

BRAIN Initiative Standards Grant for September 6, 2019 submission

Discussion in #104 raised the idea of getting BRAIN Initiative funding to support ongoing maintenance. I'm starting this thread for discussion of what that support might look like.

I'm envisioning creation of a smallish BIDS board (5 or so individuals) made up of co-investigators on this grant who each get ~10% salary support. The BIDS board would be tasked with filling 20 year-long BIDS Maintainer positions, each of which would be 5%-10% effort positions also compensated through funds from the grant. Putting some really rough numbers to it, the yearly budget for such an effort would be ~$180,000. We could potentially ask for some funding for education and outreach activities for BIDS as well.

The next submission period for BRAIN Initiative standards grants is September 6th, 2019, and the full RFA can be found here.

Summary of the BIDS Stats Models meeting (Stanford 17-19 of October 2018)

[Group photo from the meeting]
(Missing from the picture -- they had to leave early, and we were too much in the flow to take the picture earlier -- JB Poline, Jeanette Mumford, Ross Blair, Russ Poldrack, Dav Clark, Hernando Ombao, Ming Zhan, Rastko Ciric, and Jessey Wright)

The GitHub project outlines the remaining action items. Considering the release schedule of Derivatives, we aim to release RC1 of Stats Models at the end of November and to merge it into the main specification in February-March 2019.

Many thanks to our project manager Franklin Feingold for organizing the event.

Standardize .bidsignore

ATM bids-validator supports a .bidsignore file and describes it as follows:

.bidsignore

Optionally one can include a .bidsignore file in the root of the dataset. This file lists patterns (compatible with the .gitignore syntax) defining files that should be ignored by the validator. This option is useful when the validated dataset includes file types not yet supported by BIDS specification.

*_not_bids.txt
extra_data/

while also hardcoding some additional ignores:

    .add('.*')
    .add('!*.icloud')
    .add('/derivatives')
    .add('/sourcedata')
    .add('/code')

In the Python land of pybids there is a constant battle to hardcode various additional ignores (see e.g. bids-standard/pybids#277 (comment)), and the common consensus is that supporting .bidsignore would be the way forward. For that to happen properly, and to avoid varying/possibly conflicting support of the .bidsignore file across software toolkits, it would be desirable to describe it somewhere in the BIDS specification document, e.g. in a new 99-appendices/09-software-support.html, stating our expectations for that file's syntax. I see two choices:

  1. unification with .gitignore -- sounds like a nice idea, but I am afraid it might also be a fragile one unless we start using git itself, i.e. git-check-ignore (with a hard dependency on git, which I am not fond of), since over time Git might change the .gitignore specification. ATM (if I got it right) bids-validator relies on https://www.npmjs.com/package/ignore to provide a JS implementation. For pybids we could probably find some Python implementation, but so far the ones I found are too ad hoc and/or old/abandoned (e.g. https://github.com/snark/ignorance), so calling out to git-check-ignore sounds like the best idea (though see the sketch after this list for one more option).
  2. a prescribed subset of .gitignore -- we clearly describe what patterns are supported, etc.
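For what it's worth, a minimal Python sketch of matching against a .bidsignore using the third-party pathspec package, which implements gitignore-style ("gitwildmatch") matching (just one possible implementation choice, not something pybids or the validator currently does; the dataset path is hypothetical):

import pathspec  # third-party package implementing gitignore-style matching

def load_bidsignore(dataset_root):
    """Build a matcher from <dataset_root>/.bidsignore; empty matcher if absent."""
    try:
        with open(f"{dataset_root}/.bidsignore") as f:
            return pathspec.PathSpec.from_lines("gitwildmatch", f)
    except FileNotFoundError:
        return pathspec.PathSpec.from_lines("gitwildmatch", [])

spec = load_bidsignore("/data/ds000xxx")
print(spec.match_file("extra_data/notes.txt"))            # True with the example patterns above
print(spec.match_file("sub-01/anat/sub-01_T1w.nii.gz"))   # False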

What do you think, @bids-standard/everyone?

Recap table for "volume timing"?

As it stands, there are different ways to describe when each functional volume was acquired, using the following 5 fields (AFAICT):

  • DelayTime
  • AcquisitionDuration
  • SliceTiming
  • RepetitionTime
  • VolumeTiming

Some of those fields are mutually exclusive, and some options are only available for sparse sequences.

I had started working on a table to add to the specs that would recapitulate all those possibilities and might be a bit quicker than parsing sentences like:

If the field is not present it is assumed to be set to zero. This field is REQUIRED for sparse sequences using the RepetitionTime field that do not have the SliceTiming field set to allowed for accurate calculation of "acquisition time". This field is mutually exclusive with VolumeTiming.

Should I give it a go and see if it helps?

BEP006 Merging Roadmap

The preprint for BEP006 has been published and will soon be submitted to Nature Scientific Data. But there is some work that remains to be done.

Here is a suggestion on how to proceed with the final steps of BEP006. Different approaches welcome, but I think it'd be good to have a central public space where this is discussed.

  1. Getting feedback on the preprint and the current state of the extension by the community in the Google Doc
  2. Start a pull request in this bids-standard/bids-specification repository to merge the BEP006 specification in form of a markdown document, similar to the MEG document ... see #108
  3. Once all feedback has been obtained and worked into the PR for bids-specification, merge the main text of BEP006 into master
  4. Once this merge is completed, remove the bep006 flags from the bids-validator, see https://github.com/bids-standard/bids-validator/pull/681/files
  5. Once the validator accepts EEG data, merge the EEG examples branch
  6. At this point, all extensions (specification, validator, examples) have been merged into the main specification ... then we can start tracking outstanding enhancements, which we'll find by their EEG tag in either of the main bids-standard repositories on GitHub

Agreement or disagreement+alternative suggestions are welcome!

cc: @ChristophePhillips, whom I somehow could not assign via the "Assignees" GitHub tool

Use code segments instead of quotes for examples and file paths

{
   "Units": "rad/s",
   "IntendedFor": "func/sub-01_task-motor_bold.nii.gz"
}

Template:

sub-<participant_label>/[ses-<session_label>/]
    fmap/
        sub-<label>[_ses-<session_label>][_acq-<label>][_run-<run_index>]_magnitude.nii[.gz]
        sub-<label>[_ses-<session_label>][_acq-<label>][_run-<run_index>]_fieldmap.nii[.gz]
        sub-<label>[_ses-<session_label>][_acq-<label>][_run-<run_index>]_fieldmap.json

[BEP014] Open questions to resolve

These questions were written on BEP014 before our conference call on 09/28/2018. Some of them were addressed in that call, and some may still be open. This ticket is here just to keep track of that conversation and open the contents of the call to the community:

Caveat to Spaces:
Many software packages change the order of voxels or resolution of the standard templates. When the Space keyword refers to a key, it simply means that
Questions to resolve:

  1. Should we drop XXX before CoordinateSystem and CoordinateSystemUnits? It sounds like this is helping group some information. It would be useful to chat about this.
  2. How should multiple coordinate systems associated with a file be represented? (e.g., NIfTI, CIFTI, EEG + MEG)
    2a. We discussed the idea of lists within spaces, as in the example above
  3. Should the CoordinateSystem key be mandatory?
  4. How will CoordinateSystem metadata be represented, when a custom coordinate system is used?
