CANlab Robust Regression Walkthrough
This script demonstrates how to use the CANlab robust regression toolbox to run robust 2nd-level analyses.
Robust regression is a popular approach for principled down-weighting of outliers in a General Linear Model (GLM) analysis. It has particular advantages when conducting many tests across brain maps, as in most neuroimaging analyses, because outliers cannot easily be checked and dealt with for each individual test. The application of robust regression to neuroimaging is described in this paper: Wager, T. D., Keller, M. C., Lacey, S. C., & Jonides, J. (2005). Increased sensitivity in neuroimaging analyses using robust regression. NeuroImage, 26, 99-113.
Theory
• Outliers can violate assumptions and have very large effects on regression coefficients
• High-variance observations tend to dominate if their observed values are extreme
• When assumptions cannot be checked at each voxel, automatic procedures for weighting based on outlier status are advantageous
• Robust regression: An automatic procedure for identifying cases that are potential outliers and down-weighting
• Iterative estimation procedure, similar to generalized least squares (GLS), but weights are based on residual values rather than variances (see the sketch below)
• Potential way to deal with heteroscedastic variances
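To make the down-weighting concrete, here is a minimal Matlab sketch (illustrative data) comparing an ordinary least-squares fit with robustfit from the Statistics Toolbox when one observation is an extreme outlier:
% Minimal illustration: one outlier distorts the OLS slope; robust IRLS down-weights it.
rng(1);                                  % reproducible example
x = (1:20)';
y = 0.5 * x + randn(20, 1);              % true slope = 0.5
y(20) = y(20) + 15;                      % inject one extreme outlier
b_ols = [ones(20, 1) x] \ y;             % OLS fit: [intercept; slope]
[b_rob, stats] = robustfit(x, y);        % robust IRLS fit (bisquare weights by default)
fprintf('OLS slope: %.2f   Robust slope: %.2f\n', b_ols(2), b_rob(2));
fprintf('Weight assigned to the outlier: %.2f\n', stats.w(20));   % close to 0 when down-weighted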
Installation and code
Toolbox installation
In brief, you will need SPM software (for image reading/writing only) and the Robust toolbox on your Matlab path. For full functionality, you'll also need the CANlab Core Tools repository, including subfolders, on your path. Download or "clone" (e.g., via GitHub Desktop) these two toolboxes and add each to your Matlab path with subfolders:
In addition, you will need Statistical Parametric Mapping (SPM; SPM2, SPM5, or SPM12) installed and on your Matlab path. It is used for image reading and writing, but not for statistics.
If you are using robfit_parcelwise (see below), you will also need:
Installation. To install these, place the source code folders in a folder on your hard drive, and type
>> pathtool
at the Matlab command prompt.
Select each of the folders named above, and select “add with subfolders”
SPM and Matlab Statistics Toolbox required
The robust regression toolbox is not an SPM toolbox per se. It uses SPM image manipulation (data I/O) functions, but does not rely on any SPM functions for statistics. It does, however, use the Matlab Statistics Toolbox, so you will need that as well.
Main functions in the robust regression toolbox
The main functions used in the toolbox are:
robfit.m
Runs robust regression for a series of group analyses, creating a directory and saving results for each
robfit_parcelwise.m
Runs robust regression for a single group analysis on each of a series of "parcels", regions of interest that are larger than voxels and generally defined to conform to pre-defined anatomical or functional boundaries.
robustfit.m
Matlab's internal function, which is run for each test conducted across brain voxels or parcels.
robseed.m
Robust regression correlation analysis, correlating each voxel with a seed region you specify.
robust_results_batch.m
Thresholds maps and generates and saves plots and tables of results.
publish_robust_regression_report.m
Thresholds maps and generates an HTML report with results
Note: If you are using object-oriented tools in CANlab Core, the regress( ) method for fmri_data objects also has a robust regression option. This works independently of the robust regression toolbox.
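For example, a minimal sketch (my_images and my_covariates are hypothetical; the exact output fields and intercept placement may vary by CANlab Core version, so check help fmri_data.regress):
% Sketch: robust second-level regression with the fmri_data regress( ) method
dat = fmri_data(my_images);          % my_images: char matrix or cell array of image file names
dat.X = my_covariates;               % [n x k] design matrix; regress( ) adds an intercept for you
out = regress(dat, 'robust');        % iteratively reweighted (robust) fit at each voxel
t = threshold(out.t, .05, 'fdr');    % out.t is a statistic_image, one map per regressor
orthviews(t)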
Sample datasets
A sample dataset is in the "Sample_datasets" folder in CANlab_Core_Tools.
See canlab.github.io for more sample datasets, and in particular the function load_image_set( ) for some easy, pre-packaged datasets, atlases, and maps. This walkthrough will use emotion regulation data in the "Sample_datasets" folder, in this subfolder: "Wager_et_al_2008_Neuron_EmotionReg"
The dataset is a series of contrast images from N = 30 participants. Each image is a contrast image for [reappraise neg vs. look neg] for one participant.
These data were published in:
Wager, T. D., Davidson, M. L., Hughes, B. L., Lindquist, M. A., & Ochsner, K. N. (2008). Prefrontal-subcortical pathways mediating successful emotion regulation. Neuron, 59, 1037-1050.
We will load it with load_image_set( ).
Robust regression with robfit.m
Robfit runs a series of analyses. Each analysis is a univariate robust regression analysis, regressing data from each voxel in a series of images on a GLM design matrix.
- The typical use case for a single analysis is to enter a series of contrast images, one for each of n participants.
- robfit can handle a series of contrast images, with one set of n images per contrast. It creates directories called robust0001, robust0002, etc., one per analysis (i.e., per contrast).
- A SETUP.mat file for each analysis contains information about the input images and design matrix, along with some other meta-data.
Covariates
You can run robfit to test the group mean (intercept) only or you can enter one or more covariates. Each covariate is entered as a predictor in the design matrix.
As with any multiple regression, the intercept is tested (with associated t- and p-values) at the point where all covariates are zero. Therefore, if you mean-center continuous covariates and use effects codes (1, -1) for categorical variables, the intercept can be interpreted as the group effect for the average participant, assuming input images correspond to participants.
Robfit will save t-maps and P-maps for each regressor in your design matrix, with the intercept first (rob_tmap_0001.nii) followed by covariate effects if covariates are entered (rob_tmap_0002.nii, etc.)
Robfit automatically mean-centers continuous covariates (and scales them to unit variance; see the output log below). It does not mean-center categorical covariates; instead, it codes them with effects codes of 1 and -1.
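If you prefer to prepare covariates yourself before putting them in EXPT.cov, centering and effects coding might look like this (a sketch; age and is_patient are hypothetical variables):
% Sketch with hypothetical variables: one continuous and one categorical covariate
age_c = age - mean(age);                            % mean-centered continuous covariate
group = double(is_patient) - double(~is_patient);   % effects codes: 1 = patient, -1 = control
EXPT.cov = [age_c group];                           % one column per covariate; no intercept column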
robfit inputs
robfit uses a data structure with a specific format that stores information about image files and the second-level experimental design. This structure has a special variable name here, by convention: EXPT.
The main 2nd-level analysis function is called robfit.m. Once you create an EXPT structure, you will pass it into robfit.m.
Here is a list of the fields in EXPT required to ultimately run analyses with robfit:
EXPT.subjects List of subject directory names, in a cell array, e.g., {'subj1' 'subj2' …}
EXPT.SNPM.P Cell array; each cell contains names of a set of images (one per subject) to be subjected to a group analysis. Each image list is a string matrix of image file names. Thus, images for multiple 2nd level analyses can be specified at once by entering images for each analysis into separate cells in EXPT.SNPM.P
EXPT.SNPM.connames String matrix of names, whose rows correspond to names for the analyses. Each row (name) corresponds to the cell with the same index in EXPT.SNPM.P.
E.g.,
['Faces-Houses'
'Houses-Faces']
EXPT.SNPM.connums A vector of integers numbering the analyses, corresponding to cells in EXPT.SNPM.P. These numbers determine output directory names, which will be called:
robust0001
robust0002
…and so forth (one directory per 2nd-level analysis)
EXPT.mask Name of mask image containing in-analysis voxels. This need not be in the same space as the images you’re analyzing.
EXPT.cov Matrix of subjects x covariates for analysis of between-subjects effects: one row per subject, one column per predictor. Do not include an intercept column here; robfit adds the intercept as the first predictor in the design matrix.
The output images (e.g., rob_tmap_0001.nii) are numbered so that 0001 is the intercept (overall activation) and 0002 through 000x are the maps for covariates 1 through (x-1).
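For reference, if your contrast images are separate 3-D files (one per subject) rather than a single 4-D file as in the walkthrough below, building one analysis cell might look like this (a sketch with hypothetical paths and names):
% Sketch (hypothetical directory layout): one 3-D contrast image per subject
d = dir(fullfile('/my/study/dir', 'sub*', 'con_0003.nii'));
EXPT.subjects = {d.folder};                                 % or just the subject folder names
EXPT.SNPM.P{1} = char(fullfile({d.folder}', {d.name}'));    % string matrix, one row per image
EXPT.SNPM.connames = 'My_contrast_name';
EXPT.SNPM.connums = 1;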
Load the dataset
% This script uses object-oriented tools, which we cover in another tutorial.
% For now, it's just a convenient way to get a set of image filenames.
[data_obj, subject_names, image_names] = load_image_set('emotionreg');
Loaded images:
/Users/f003vz1/Documents/GitHub/CanlabCore/CanlabCore/Sample_datasets/Wager_et_al_2008_Neuron_EmotionReg/Wager_2008_emo_reg_vs_look_neg_contrast_images.nii
(the same 4-D file is listed 30 times, once per image volume)
% data_obj is an fmri_data object, with its own methods and properties
Load the behavioral data
beh_data_file = which('Wager_2008_emotionreg_behavioral_data.txt')
if isempty(beh_data_file), error('Sample behavioral data not found. '); end
beh = readtable(beh_data_file);
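Before building the design, it is worth checking that the behavioral rows line up with the images (a sketch; fmri_data objects store one image per column of .dat):
% Sanity check: one behavioral row per input image
n_images = size(data_obj.dat, 2);
if size(beh, 1) ~= n_images
    error('Behavioral rows (%d) do not match the number of images (%d).', size(beh, 1), n_images);
end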
Define the structure with input variables for robfit
Here we fill in fields of a structure, called EXPT, that contains what we need to run the analysis. We'll do this for only a single contrast, though we could add more cells to the fields to include others as well.
% The fields of EXPT should be:
% EXPT.SNPM.P Cell vector. Each cell specifies images for one analysis. Each cell contains a string matrix with image names for the analysis. Image files can be 3-D or 4-D images.
% EXPT.SNPM.connames Cell vector. Each cell contains a string with the analysis name (e.g., contrast name) for this contrast
% EXPT.SNPM.connums Vector of contrast numbers. Determines folder names (e.g., 1 = robust0001)
% EXPT.cov [n x k] matrix of covariates: n observations (must match the number of images) by k covariates. Leave empty for a 1-sample t-test.
% EXPT.mask Optional mask file image name, for voxels to include
EXPT.subjects = subject_names;
EXPT.SNPM.P{1} = image_names{1}; % string matrix of image names
% Note: In the sample data file, 30 images for 30 subjects are saved in 1
% 4-D .nii file. So we only need to list the name of this one file.
% If you had 3-D image files, this would be a string matrix with 30 rows
EXPT.SNPM.connames = 'Reapp_vs_Look';
EXPT.mask = which('gray_matter_mask.img');
EXPT.cov = beh.Y_Reappraisal_Success;
% Because we have one column in EXPT.cov, our 2nd-level design matrix
% will contain two images (e.g., two t-maps/p-maps):
% One for the intercept (always first) and one for the covariate entered.
Let's look at the input structure we have created:
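For example, typing the variable names at the command prompt displays their fields:
EXPT          % top-level fields: subjects, SNPM, mask, cov
EXPT.SNPM     % analysis-specification fields: P, connames, connums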
Create a directory for the results and go to it
This directory will contain subdirectories called robust00XX for each cell containing images entered in EXPT.SNPM.P.
Before you run this, make sure you start in a directory where you can create a folder for the sample results.
basedir = fullfile(pwd, 'Robust_regression_sample_results_dir');
if ~exist(basedir, 'dir'), mkdir(basedir), end
% Save the EXPT setup file for later reference:
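% A sketch of this step: robfit writes its robust000x folders into the current directory,
% and the listing later in this walkthrough shows EXPT.mat saved alongside them.
cd(basedir)
save EXPT EXPT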
Run robfit
EXPT = robfit(EXPT);
Robfit.m
____________________________________________________________
Preparing covariates:
Cov 1: Centering and scaling to unit variance. Will be saved in rob_tmap_0002
____________________________________________________________
robfit: Will write to robust???? directories in: /Users/f003vz1/Downloads/Robust_regression_sample_results_dir
Robfit.m - working on Reapp_vs_Look
Running OLS and IRLS comparison (slower) - to turn this off, use 0 as 3rd input argument.
____________________________________________________________
Using mask image:
/Users/f003vz1/Documents/GitHub/CanlabCore/CanlabCore/canlab_canonical_brains/Canonical_brains_surfaces/gray_matter_mask.img
Saving results in : robust0001 (Creating directory)
Minimum allowed observations per voxel:25
Image name for first image:
/Users/f003vz1/Documents/GitHub/CanlabCore/CanlabCore/Sample_datasets/Wager_et_al_2008_Neuron_EmotionReg/Wager_2008_emo_reg_vs_look_neg_contrast_images.nii
Number of images: 1
31078 voxels, 30 planes in analysis
Done: 100% done. writing volumes.
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/nsubjects.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/mask.nii
Writing 4-D weight file: /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/weights.nii
Writing t- and p-images for univariate effects in model.
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/rob_beta_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/rob_tmap_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/rob_p_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/ols_beta_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/ols_tmap_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/ols_p_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/irls-ols_z_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/irls-ols_p_0001.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/rob_beta_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/rob_tmap_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/rob_p_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/ols_beta_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/ols_tmap_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/ols_p_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/irls-ols_z_0002.nii
Writing /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/irls-ols_p_0002.nii
Creating HTML report for results in:
/Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001
Saved HTML report:
/Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001/published_output/robust_regression_report_09-Aug-2022_18_58/robust_results_batch.html
% Try help robfit for more options.
% e.g., you can specify that you want to run only some of the contrasts
% in EXPT.SNPM.P. You can specify that you want to create OLS images
% side by side with robust images to compare.
% And you can specify a mask filename.
% The mask image should contain 1’s and 0’s, with 1’s in voxels you want to analyze.
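% A sketch of such a call (argument order inferred from the notes above and the log line
% "to turn this off, use 0 as 3rd input argument"; check 'help robfit' to confirm):
% EXPT = robfit(EXPT, 1, 0, which('gray_matter_mask.img'));  % analysis 1 only, skip OLS comparison, explicit mask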
Note: The robfit command will also attempt to run publish_robust_regression_report for each set of images (each analysis) it runs. This generates an HTML report in the results directory and opens it. Thus, if this command runs correctly, it will bring up a report (or multiple reports).
Examine the output files
ls -lt
total 8
drwxr-xr-x 23 f003vz1 staff 736 Aug 9 18:58 robust0001
-rw-r--r-- 1 f003vz1 staff 824 Aug 9 18:57 EXPT.mat
The robust0001 directory contains the following files:
published_output A folder with HTML reports from publish_robust_regression_report
irls-ols_p_0002.nii P-value image for the covariate, difference between robust and OLS
irls-ols_z_0002.nii Z-value image for the covariate, difference between robust and OLS
ols_p_0002.nii P-value image for the covariate, OLS regression
ols_tmap_0002.nii t-value image for the covariate, OLS regression
ols_beta_0002.nii beta (slope) image for the covariate, OLS regression
rob_p_0002.nii P-value image for the covariate, robust regression
rob_tmap_0002.nii t-value image for the covariate, robust regression
rob_beta_0002.nii beta (slope) image for the covariate, robust regression
irls-ols_p_0001.nii P-value image for the intercept, difference between robust and OLS
irls-ols_z_0001.nii Z-value image for the intercept, difference between robust and OLS
ols_p_0001.nii P-value image for the intercept, OLS regression
ols_tmap_0001.nii t-value image for the intercept, OLS regression
ols_beta_0001.nii beta (slope) image for the intercept, OLS regression
rob_p_0001.nii P-value image for the intercept, robust regression
rob_tmap_0001.nii t-value image for the intercept, robust regression
rob_beta_0001.nii beta (slope) image for the intercept, robust regression
weights.nii Robust regression weights for each image (i.e., participant)
mask.nii mask of voxels included in the analysis
nsubjects.nii Image with integer values for number of subjects with valid data
SETUP.mat Metadata file
- If you are using CANlab object-oriented tools to threshold and view the results maps, you can load and combine the t- and P-maps into t-statistic image objects for each regressor.
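For example, a sketch (run from within the robust0001 directory; the 'dfe' value, 30 subjects minus 2 regressors, is illustrative and should match your design):
% Sketch: load the robust t-map for the intercept as a statistic_image object,
% FDR-threshold it, and display it.
t = statistic_image('image_names', 'rob_tmap_0001.nii', 'type', 't', 'dfe', 28);
t = threshold(t, .05, 'fdr');    % FDR q < .05 across in-mask voxels
orthviews(t)                     % or: montage(t)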
Get batch results
Results should have been saved by publish_robust_regression_report as
HTML. To re-run batch results, go to the robust regression folder and
run the batch script.
You can edit this script to customize it as well.
robust_results_batch
----------------------------------------------
Robust regression
----------------------------------------------
Loading files from /Users/f003vz1/Downloads/Robust_regression_sample_results_dir/robust0001
robust0001
Reapp_vs_Look
Reapp_vs_Look
robust0001
----------------------------------------------
Design Matrix
----------------------------------------------
First regressor (image) is intercept: Yes
Regressors mean-centered (intercept reflects group mean): Yes
Number of regressors (including intercept): 2
Setting up fmridisplay objects
Grouping contiguous voxels: 1 regions
sagittal montage: 1058 voxels displayed, 29070 not displayed on these slices
sagittal montage: 1032 voxels displayed, 29096 not displayed on these slices
sagittal montage: 985 voxels displayed, 29143 not displayed on these slices
axial montage: 9831 voxels displayed, 20297 not displayed on these slices
axial montage: 10506 voxels displayed, 19622 not displayed on these slices
Grouping contiguous voxels: 1 regions
sagittal montage: 1058 voxels displayed, 29070 not displayed on these slices
sagittal montage: 1032 voxels displayed, 29096 not displayed on these slices
sagittal montage: 985 voxels displayed, 29143 not displayed on these slices
axial montage: 9831 voxels displayed, 20297 not displayed on these slices
axial montage: 10506 voxels displayed, 19622 not displayed on these slices