Beauchamp:CorticalSurfaceHCP

From OpenWetWare
Beauchamp Lab



Other Atlases

There are a number of surface atlases, including the Aalto University atlas of identified visual areas http://ltl.tkk.fi/wiki/Atlas

Introduction

Glasser et al. (Nature 2016) used the HCP dataset (including resting-state functional connectivity, T1 and T2 images) to parcellate the cerebral cortex into 180 areas per hemisphere, creating an atlas known as the HCP MMP (multimodal parcellation) 1.0 atlas. A version of this parcellation converted to the MNI152 volumetric atlas space is now released as part of AFNI. You could also use SUMA to apply this parcellation to your own subjects while keeping the registration in surface space. A description of how the atlas was made and some examples of how you may want to apply this to your own data are included below.

How the Atlas was made

Converting the HCP MMP atlas to the surface space of the MNI template

Part of this process (specifically, converting the MMP atlas to FreeSurfer's fsaverage space) was completed by Kate Mills. https://figshare.com/articles/HCP-MMP1_0_projected_on_fsaverage/3498446

We then converted the fsaverage version to the surface of the MNI152_T1_2009c template, which is also included in AFNI. (Commands are written once for both hemispheres for brevity, using the "?h" shorthand, but you do have to run them for each hemisphere separately.)

# convert to a filetype that both Freesurfer and AFNI/SUMA can appreciate
mri_convert ?h.Glasser2016.mgz ?h.Glasser2016.nii
# Create a surface reconstruction of the template
 # start by zeropadding the atlas, otherwise, FS will move it just ever so slightly, 
 # and the resulting files won't actually be aligned to the template.
3dZeropad -prefix ./MNI_Atlas_zp.nii -RL 256 -AP 256 -IS 256 /Applications/AFNI/MNI152_T1_2009c+tlrc
 # run recon-all, and then SUMA-ize the output
recon-all -all -sid MNI_Atlas -i MNI_Atlas_zp.nii
@SUMA_Make_Spec_FS -NIFTI -sid MNI_Atlas

 # now that the surface space for the MNI template is defined, use Freesurfer to move from fsaverage to MNI
mri_surf2surf --srcsubject fsaverage --trgsubject MNI_Atlas --sval atlasmgz/?h.Glasser2016.nii \
--tval MNI_Atlas/SUMA/?h.Glasser_HCP.gii --mapmethod nnf --hemi ?h
 # Now, in SUMA world, you have a native space mesh and the std.141 mesh.  We want the parcellation in the 
 # std.141 mesh so it can be applied to anyone. 
SurfToSurf -i_gii std.141.?h.smoothwm.gii -i_gii ?h.smoothwm.gii -prefix res.141.?h -mapfile \
std.141.MNI_Atlas_?h.niml.M2M -dset ?h.Glasser_HCP.gii -output_params NearestNode
 # rename this so it doesn't look silly. 
mv res.141.?h.?h.Glasser_HCP.niml.dset res.141.?h.Glasser_HCP.niml.dset
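As noted above, every command written with the "?h" shorthand has to be run once per hemisphere. A minimal sketch of that loop, using the first mri_convert step as the example (echo makes it a dry run; the same pattern applies to the other per-hemisphere commands on this page):

```shell
# Expand the ?h shorthand into explicit lh/rh invocations.
# echo makes this a dry run; remove it to actually execute the command.
for hemi in lh rh; do
  echo mri_convert ${hemi}.Glasser2016.mgz ${hemi}.Glasser2016.nii
done
```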

At this point, we have a version of the Glasser Atlas on the std.141 (SUMA template) mesh, but the transformations from brain to brain and mesh to mesh left it looking a little jagged. We'll come back and smooth this later.

Converting the HCP MMP atlas to the volume space of the MNI template

One thing we noticed is that the recon procedure on the template resulted in rather thin ribbons. This is probably because the contrast from the template (which comes from averaging 152 brains) was a little less defined, and so the white/grey matter border ended up being (consistently) not quite where we would have put it had we done it by eye. We'll fix that next.

 # blur the ribbon file (that defines where the grey matter is); in a second, we're going to take 
 # a more generous threshold on the blurred ribbon, so the end result is that the ribbon is 'fatter' 
3dcalc -a ?h.ribbon.nii -datum float -prefix ?h.fl.ribbon.nii -expr 'a'
3dmerge -1blur_fwhm 2 -doall -datum float -prefix ?h.blur.ribbon.nii ?h.fl.ribbon.nii
 # Sample from the surface back to the volume, filling up the (fatter) cortical ribbon
@surf_to_vol_spackle -spec std.141.MNI_Atlas_?h.spec -surfA smoothwm -surfB pial -maskset \
'?h.blur.ribbon.nii<0.25..2>' -surfset res.141.?h.Glasser_HCP.niml.dset -mode -prefix ?h.Glasser_HCP.fill


Clean up, and Atlasize

At this point, we have two files (left and right) with the parcellations in volume files, aligned to the MNI template. However, they're a little messy from all the transformations, and they're missing some information that would make them more like a useful atlas. So labels were added, the files were told that they were in MNI space (they didn't know), one hemisphere's label values were offset by 1000 to keep left and right distinct, and the two sides were added together.
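As a toy illustration of that 1000 separation (the label values, and which hemisphere receives the offset, are made up for the example): shifting one hemisphere by 1000 before summing guarantees the two label ranges can never collide.

```shell
# Toy arithmetic only; in the real pipeline the offset and sum were done with AFNI.
lh_label=42                 # parcel value in one hemisphere, unchanged
rh_label=$((42 + 1000))     # same parcel number in the other hemisphere, after the offset
echo "hemisphere A: ${lh_label}, hemisphere B: ${rh_label}"
# prints: hemisphere A: 42, hemisphere B: 1042
```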

We also smoothed using a modal smoothing.

 3dLocalstat -stat mode -nbhd 'SPHERE(2)' -prefix mode.MNI_Glasser_HCP.nii MNI_Glasser_HCP.nii
 3dcalc -a mode.MNI_Glasser_HCP.nii -b MNI_Glasser_HCP.nii -expr '(step(b)-step(a))*b + a' \
    -prefix smooth.MNI_Glasser_HCP.nii

Coordinates were also generated for each parcellation. This is trickier than it sounds on the surface (get it?) because the parcellations are oddly shaped. First, because they exist in the grey matter ribbon, they are folded such that the (volumetric) center of mass of a parcellation is often a point that is not inside the parcellation. Then, even when defined on the surface, many parcellations are C-shaped, so the centroid vertex has the same problem.

To get around this, FreeSurfer's mris_divide_parcellation was used to cut each parcellation into equal pieces "perpendicular to the long axis." Each division was set to be 75 mm2 in size. This size was chosen arbitrarily, but set to be small enough that each division was essentially a thin strip (and thus regularly shaped). The middle division (or the union of the two central divisions, for an even number of divisions) was identified; this region has equal area on either side of it. The centroid vertex of this region was taken as the coordinate for the parcellation. To ensure that this was, in fact, still in the volumetric grey matter ribbon, a 4 mm sphere was drawn around the calculated coordinate, the intersection of this sphere and the parcellation was taken, and the (volumetric) center of mass of that was kept as the final coordinate.

All of these steps, while they produce a reasonable coordinate for each parcellation, have considerable uncertainty attached, and no efforts whatsoever have been made to characterize how much of an effect that may have on, for instance, seed placement for functional connectivity analyses. Use at your own risk, and don't come crying to me if it doesn't work. (Alternatively, if it works really well for you, I'll take payment in chocolate.)
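The final center-of-mass step can be sketched as below (the coordinates are invented, and the mris_divide_parcellation and sphere steps are not shown): just average the x, y, z coordinates of the points that survive.

```shell
# Center of mass of a small, made-up list of x y z coordinates (one point per line).
printf '10 20 30\n12 22 28\n14 18 32\n' | \
  awk '{x+=$1; y+=$2; z+=$3; n++} END {printf "%.1f %.1f %.1f\n", x/n, y/n, z/n}'
# prints: 12.0 20.0 30.0
```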

If you want to know more details (like commands) for how this was done, you may ask. But I'm warning you up front: the code is messy and hard to follow, and I have no intention of cleaning it up.

Back to the Surface

I told you I'd come back to this. As I said, the surface (std.141) files were a bit jagged. So we did the volume processing described above, but at a finer resolution (0.5 mm isotropic), and put the results back on the surface. This smoothed over those boundaries a tad.

3dVol2Surf -spec ../../SUMA/std.141.MNI_Atlas_?h.spec -surf_A smoothwm -surf_B pial \
-sv ../../SUMA/MNI_Atlas_SurfVol.nii -grid_parent ?h.hcp_mode2+orig -map_func mode \
-f_steps 5 -f_index nodes -out_niml ?h.MNI_Glasser_HCP.niml.dset
3dcalc -a ?h.MNI_Glasser_HCP.niml.dset -b res.141.?h.Glasser_HCP.niml.dset -expr \
'step(a)*a + (1-step(a))*b' -prefix ?h.std.141.Glasser_HCP.niml.dset

 

What you could do with the Atlas

Viewing the Surface files

Create standardized cortical surface models (std.141) for your subject(s). See Cortical Surface models overview for details.

Copy the std.141 HCP atlas into the subject directory. There are two files, one for each hemisphere. For instance, if we would like to see the labels on the N27 standard brain,

 cd /Volumes/data/BCM/N27/suma_MNI_N27
 cp ?h.std.141.Glasser_HCP.niml.dset .

Start SUMA and load the annotation files. Make sure you load the std.141 meshes, or the labels will be incorrect.

 suma -spec std.141.MNI_N27_both.spec &
 DriveSuma -com surf_cont -load_dset lh.std.141.Glasser_HCP.niml.dset -surf_label lh.smoothwm.gii  -view_surf_cont y -switch_cmap ROI_i256

Different values/colors on the surface correspond to each atlas label.

You would probably want to do something similar for each of your subjects. This assumes surface reconstruction has been completed for each subject and that you've run @SUMA_Make_Spec* so each one has its own std.141 mesh. Once your subjects have the std.141 mesh, you can follow along, substituting your own subject's directory and name for MNI_N27.
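A sketch of that per-subject staging (subject names and paths here are hypothetical, and echo makes it a dry run):

```shell
# Copy the two atlas files into each subject's SUMA directory.
# Remove echo to actually perform the copies.
for sid in sub01 sub02; do
  for hemi in lh rh; do
    echo cp ${hemi}.std.141.Glasser_HCP.niml.dset /Volumes/data/BCM/${sid}/SUMA/
  done
done
```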

Applying to functional data

For a different example, let's say you're hoping to plot some functional data in one of these regions specifically. We'll assume you've run FreeSurfer recon-all, @SUMA_Make_Spec_FS, and your afni_proc.py script (or whatever else you've used to generate some functional data).

As above, you need to sample the atlas to your own subject.

 cp /Volumes/data/scripts/?h.std.141.Glasser_HCP.niml.dset .
 

Now, you either need to sample your functional data to the surface, or sample the atlas into your functional volume. In the first example, we'll move the functional data to the std.141 mesh on the surface (the atlas is really only defined on the surface, after all). In this example, we're also converting all the volumes of the stats file. All of the following will need to be done for both hemispheres.

 # Sample functional data to surface
 3dVol2Surf                                \
      -spec         $SUBJECTS_DIR/$sid/SUMA/std.141.${sid}_${hemi}.spec \
      -surf_A       smoothwm                 \
      -sv           $SUBJECTS_DIR/$sid/SUMA/${sid}_SurfVol.nii         \
      -grid_parent  stats.${sid}+orig           \
      -map_func     mask                     \
      -out_niml       stats.${sid}.$hemi.niml.dset

Now, your stats output has been resampled to the std.141 mesh. Since your atlas is already in this space, you just need to define the mask from the atlas with something like...

 3dcalc -a ${hemi}.std.141.Glasser_HCP.niml.dset \
    -expr 'amongst(a,104,124,175,174,24,173,28,25)' \
    -prefix STG_mask.${hemi}.niml.dset

And take an average of your stats file in the mask.

 3dmaskave -q -mask STG_mask.${hemi}.niml.dset stats.${sid}.${hemi}.niml.dset > stats.${hemi}.out.txt

Your output file here is the average within the mask of your statistics. These will be in the order they are in the stats.${sid}+orig file. To find out what that is, run

 3dinfo -verb stats.${sid}+orig
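For example, if 3dinfo shows that the statistic you want is sub-brick #5 (this index is made up; sub-bricks count from 0), you can pull the matching line out of the 3dmaskave output, which should have one value per sub-brick per line:

```shell
# Grab the value for sub-brick index 5, i.e. line 6 of the output file.
# The numbers piped in stand in for a real stats.${hemi}.out.txt.
printf '0.1\n0.2\n0.3\n0.4\n0.5\n3.7\n' | awk 'NR==6 {print; exit}'
# prints: 3.7
```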

Viewing the Volume files

To make viewing easier, DriveSuma can be used to automate loading of the files.

set h = /Volumes/data/scripts
set p = `pwd` 
afni -R -niml &
suma -spec std.141.MNI_N27_both.spec -sv T1.nii &
DriveSuma -com surf_cont -load_dset lh.std.141.Glasser_HCP.niml.dset -surf_label lh.smoothwm.gii  -view_surf_cont y -load_cmap {$h}/nice.1D.cmap -Dim 0.6
DriveSuma -com surf_cont -load_dset std.141.rh.sulc.niml.dset -surf_label rh.smoothwm.gii  -view_surf_cont y -switch_cmap nice.1D -Dim 0.6
DriveSuma -com viewer_cont -key b  -1_only n
DriveSuma -com viewer_cont -key F3
DriveSuma -com viewer_cont -load_view  {$h}/nice.niml.vvs -com surf_cont -switch_surf lh.inf_200.gii
DriveSuma -com viewer_cont -key t
DriveSuma -com surf_cont -1_only n
plugout_drive -com 'SWITCH_OVERLAY MNI_Glasser_HCP+tlrc' -com 'SEE_OVERLAY +' -quit

Right click on the AFNI color bar and choose the "ROI_i256" color bar for optimal viewing.

Applying to Functional Data

As an alternative to resampling your statistics to the surface, you could sample the atlas from the surface to the volume. Most of this procedure is described above, but for clarity, it would look something like...

 3dresample -master stats.${sid}+orig -inset MNI_Glasser_HCP+tlrc \
    -prefix HCP.volume.res.${hemi}.nii
 3dcalc -a HCP.volume.res.${hemi}.nii -expr 'amongst(a,104,124,175,174,24,173,28,25)' \
    -prefix STG_mask.${hemi}.nii
 3dmaskave -q -mask STG_mask.${hemi}.nii stats.${sid}+orig > stats.${hemi}.out.txt