preprocessed-connectomes-project / abide
ABIDE Preprocessed Initiative
Hi,
I tried to run the "download_abide_preproc.py" script to download the ABIDE preprocessed data from S3, but encountered the following error:
urllib.error.URLError: <urlopen error [Errno 60] Operation timed out>
I am wondering whether the given URL is still working, or whether there is an updated one?
Any suggestion or comment is much appreciated!
Thank you!
Guixiang
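If the endpoint is only intermittently timing out, one generic workaround (a sketch, not part of `download_abide_preproc.py`) is to wrap the fetch in a retry loop with exponential backoff:

```python
import time
import urllib.request
import urllib.error

def download_with_retry(url, dest, retries=3, timeout=60, backoff=2.0):
    """Fetch `url` into the file `dest`, retrying on URL errors
    (e.g. 'Operation timed out') with exponential backoff."""
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                with open(dest, "wb") as f:
                    f.write(resp.read())
            return dest
        except urllib.error.URLError:
            if attempt == retries - 1:
                raise  # out of retries; surface the original error
            time.sleep(backoff ** attempt)
```

If the error persists across many retries and networks, the key or endpoint itself has probably changed, and retrying will not help.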
Hi,
I am in the process of running subjects through the NIAK pipeline for ABIDE; however, I am finding that the mean FD values I get do not match those in the existing ABIDE database I downloaded from http://preprocessed-connectomes-project.org/abide/. Is there a repository or place where I can view every step of the NIAK pipeline, so I can replicate the results that were in the spreadsheet? Thank you!
Hi,
I downloaded the ABIDE raw and preprocessed data from the NITRC website. I have CPAC installed on my workstation, and when I ran the pipeline, the preprocessed results (ROI time series) for certain sites did not match the results posted on the website (with significant differences). I used the same pipeline configuration provided on the website. Are there any other site-specific parameters used when running the pipeline, apart from the slice-timing pattern?
Thanks. A response is much appreciated!
Meenakshi
Hi, the website you referenced for more detail about the labeling protocols used for the ROI-based morphology measures is out of date. This is the new page I found on the Mindboggle site: DKT cortical labeling protocol (31 labels).
Hello,
I'm trying to download T1w images for certain age groups. All the download pipelines seem to be centered around functional-imaging metrics. I am able to download the functional NIfTIs without issue, but I do not see a simple way to download the T1w images. I've gone through the parameters/derivatives in the download code and I simply do not see an option for pulling T1w's.
Any help would be appreciated.
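Since the download script exposes functional derivatives, one fallback is to build anatomical S3 URLs yourself. The key template below is an assumption (it follows the standard FreeSurfer output layout under the `Outputs/freesurfer/5.1/` prefix seen elsewhere in this thread); verify a candidate URL in a browser before scripting a bulk download.

```python
# Base prefix for the public ABIDE bucket, as used by the download script.
S3_PREFIX = "https://s3.amazonaws.com/fcp-indi/data/Projects/ABIDE_Initiative"

def build_url(template, file_id):
    """Fill a caller-supplied S3 key template for one subject.

    The template is hypothetical -- check that the resulting key actually
    exists in the bucket before relying on it.
    """
    return template.format(prefix=S3_PREFIX, file_id=file_id)

# Assumed location of the FreeSurfer anatomical volume for subject 50002:
t1_template = "{prefix}/Outputs/freesurfer/5.1/{file_id}/mri/T1.mgz"
print(build_url(t1_template, "50002"))
```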
Hi
The scan time is different for some subjects in the KKI dataset. Does the discrepancy occur at the beginning or at the end of the scan?
All the best,
Hello,
Thanks for making these resources available, but I'm running into an issue. I downloaded the imaging data from the XNAT NITRC server, but I don't see any information about participant age anywhere. Could you please help out? The DOB field is empty, as you can see; I would need their age at scan. Thank you in advance.
The articles on the ABIDE preprocessed publication list all seem to be for the ADHD200 dataset.
I downloaded the CPAC preprocessed ABIDE data using the Dosenbach atlas [1]. Although the atlas consists of 160 ROIs, the datasets contain 161 variables. Documentation explaining the extra variable would be helpful. I suspect the 161st column is the global mean signal; it would be great if someone could confirm this.
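One hedged way to test the global-mean hypothesis yourself (the meaning of the 161st column is not documented here): if it really were a global mean signal, it should correlate very strongly with the mean of the 160 ROI columns. Note the true global signal is usually averaged over whole-brain voxels rather than over ROI means, so expect a high but not perfect correlation.

```python
import numpy as np

def global_mean_correlation(ts):
    """ts: (n_timepoints, 161) array of extracted time series.

    Returns the Pearson correlation between the last (suspect) column
    and the mean of the other 160 ROI columns.
    """
    candidate = ts[:, -1]
    roi_mean = ts[:, :-1].mean(axis=1)
    return np.corrcoef(candidate, roi_mean)[0, 1]
```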
Dosenbach atlas from
https://fcp-indi.s3.amazonaws.com/data/Projects/ABIDE_Initiative/Resources/dos160_roi_atlas.nii.gz
contains an extra label
>>> import nibabel
>>> import numpy as np
>>> img = nibabel.load('dos160_roi_atlas.nii.gz')
>>> print(np.unique(img.get_data()))
[ 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17
18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35
36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53
54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71
72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89
90 91 92 93 94 95 96 97 98 99 100 101 102 103 104 105 106 107
108 109 110 111 112 113 114 115 116 117 118 119 120 121 122 123 124 125
126 127 128 129 130 131 132 133 134 135 136 137 138 139 140 141 142 143
144 145 146 147 148 149 150 151 152 153 154 155 156 157 158 159 160 223]
>>> np.where(img.get_data()==223)
(array([33]), array([24]), array([30]))
The last region label, 223, is not in the Dosenbach atlas; it is just an intersection between region 108 (MNI -5, -52, 17) and region 115 (MNI -11, -58, 17), which are too close together.
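A hedged workaround (assuming, as above, that label 223 is spurious rather than a real ROI) is to zero out any voxel whose label falls outside the documented 1-160 range before extracting time series:

```python
import numpy as np

def drop_spurious_labels(labels, n_rois=160):
    """Set any label outside 1..n_rois to 0 (background)."""
    cleaned = labels.copy()
    cleaned[(cleaned < 0) | (cleaned > n_rois)] = 0
    return cleaned

# Toy array standing in for img.get_data(); 223 becomes background.
toy = np.array([[0, 1, 160], [223, 5, 2]])
print(np.unique(drop_spurious_labels(toy)))
```

The cleaned array can be wrapped back into a NIfTI image with nibabel and saved before re-running the extraction.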
According to the documentation, this parcellation should have 110 ROIs. However, the actual dataset has an extra variable; I suspect it is variable 83, which does not seem to correspond to any legitimate cortical or subcortical region. The NIfTI atlas provided in the documentation suggests that 83 corresponds to two seemingly arbitrary voxels.
Also note that the documentation for many of the atlases does not provide ROI label files.
Harvard-Oxford (HO): The HO atlas distributed with FSL is split into cortical and subcortical probabilistic atlases. A 25% threshold was applied to each of these atlases and they were subsequently bisected into left and right hemispheres at the midline (x=0). ROIs representing left/right WM, left/right GM, left/right CSF and brainstem were removed from the subcortical atlas. The subcortical and cortical ROIs were combined and then fractionated into functional resolution using nearest-neighbor interpolation. [Atlas][Labels]
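The thresholding step described above can be sketched on toy arrays (this illustrates the 25% threshold plus max-probability assignment in numpy; it is not the exact FSL procedure that was used):

```python
import numpy as np

def maxprob_labels(prob_maps, thresh=0.25):
    """prob_maps: (n_rois, x, y, z) array of probabilities in [0, 1].

    Threshold each probabilistic map, then assign each voxel the
    max-probability ROI; voxels below threshold everywhere stay background.
    """
    masked = np.where(prob_maps >= thresh, prob_maps, 0.0)
    labels = masked.argmax(axis=0) + 1       # 1-based ROI labels
    labels[masked.max(axis=0) == 0] = 0      # background below threshold
    return labels
```

The left/right bisection at x=0 and the nearest-neighbor resampling to functional resolution are separate steps applied afterwards.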
Where can I find a table describing the acquisition details of the ABIDE dataset? For example, the MRI scanner used, the size and resolution of the various images, the scan duration for each patient, etc.
Hello,
I tried this link to get data from S3:
https://s3.amazonaws.com/fcp-indi/data/Projects/ABIDE_Initiative/Outputs/freesurfer/5.1/50002/surf/lh.pial
But I get an error. Am I doing something wrong?
Here's the error:
<Error>
<Code>NoSuchKey</Code>
<Message>The specified key does not exist.</Message>
<Key>data/Projects/ABIDE_Initiative/Outputs/freesurfer/5.1/50002/surf/lh.pial</Key>
<RequestId>809A3F0391F8BD8E</RequestId>
<HostId>fYcV661vrZd4jjhRa5Qo5bHLiTxww8+qsiBoe4t5CMyNFRCUzPT4in+QPSqS+CmhRRMphsUS6cQ=</HostId>
</Error>
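A `NoSuchKey` response means the bucket answered but no object lives at that exact key, so the path (not your access) is the problem. A small probe like this (a generic sketch, not an official project tool) can check candidate keys before scripting a bulk download; S3 returns the error as an HTTP error status, which `urlopen` raises as `HTTPError`/`URLError`:

```python
import urllib.request
import urllib.error

def url_exists(url, timeout=10):
    """Return True if `url` opens successfully, False on any URL error
    (HTTPError subclasses URLError, so both are caught)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.URLError:
        return False
```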
Is there a script for running FreeSurfer on all subjects? I would like to use the same FreeSurfer pipeline for ABIDE II.
Is it possible to get the dependencies, especially the C-PAC and other library versions, needed to reproduce the results and extend them to the ABIDE II dataset?
Hi there!
We're working on surface-based analysis of resting-state data from ABIDE, so we're trying to use FreeSurfer's full output directory, which we were able to download together with the preprocessed functional data, in order to project the functional data onto FreeSurfer's meshes (before doing whatever we'd like to do on the surface ;) ).
However, the preprocessed functional data is in MNI space, and it seems FreeSurfer's recon-all was run in each subject's native space. The two are therefore not aligned, and we cannot directly project the preprocessed functional data onto the surface.
So how would you do it? One of my ideas was to get the "Anatomical2MNI" transformation files that were estimated to put the functional data into MNI space, and apply the inverse transformation to bring the functional data back into native space (if possible), but I cannot find those transformation files. Are they available somewhere? If not, how would you do it?
Thanks for your help,
Sylvain