- create_meg_dataset.ipynb
Run the file `MEG_preprocessing.ipynb` after setting the variables that specify the path to the MEG dataset.
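The core step of that notebook is cutting the continuous MEG recording into stimulus-locked epochs. A minimal sketch of that operation in plain NumPy (the notebook itself presumably uses MNE; the channel count, sampling rate, and window length here are illustrative assumptions):

```python
import numpy as np

def epoch_meg(raw, event_samples, sfreq, tmin=-0.1, tmax=0.5):
    """Cut stimulus-locked windows out of a continuous recording.

    raw           : (n_channels, n_samples) continuous MEG data
    event_samples : sample indices of stimulus onsets
    tmin, tmax    : window around each onset, in seconds
    """
    start = int(tmin * sfreq)
    stop = int(tmax * sfreq)
    epochs = []
    for onset in event_samples:
        lo, hi = onset + start, onset + stop
        if lo >= 0 and hi <= raw.shape[1]:  # drop windows that fall off the edges
            epochs.append(raw[:, lo:hi])
    return np.stack(epochs)  # (n_epochs, n_channels, n_times)

# 10 s of fake 271-channel data at 1000 Hz, with three "stimulus" onsets
rng = np.random.default_rng(0)
raw = rng.standard_normal((271, 10_000))
epochs = epoch_meg(raw, event_samples=[1000, 4000, 8000], sfreq=1000)
print(epochs.shape)  # (3, 271, 600)
```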
- fmri_dimensionality_reduction.ipynb
To create the dimensionality-reduced fMRI data, run the script `fmri_dimensionality_reduction.py`. Specify `dir_name` and the path to the fMRI data in the script, and pass the subject ID (`'01'`, `'02'`, or `'03'`) as an argument.
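For reference, the kind of reduction such a script performs can be sketched with an SVD-based PCA in plain NumPy (the trial/voxel dimensions and component count are illustrative assumptions, not the script's actual settings):

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project (n_trials, n_voxels) fMRI responses onto the top principal components."""
    Xc = X - X.mean(axis=0)                   # center each voxel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T           # (n_trials, n_components)

rng = np.random.default_rng(0)
betas = rng.standard_normal((200, 5000))      # fake per-trial voxel responses
reduced = pca_reduce(betas, n_components=64)
print(reduced.shape)  # (200, 64)
```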
- create_meg_fmri_subset.ipynb
- create_meg_fmri_dataset.ipynb
To create the train and test splits of the MEG dataset, specify the paths to the preprocessed MEG images and the image embeddings in cells 3, 4, and 7, then run the remaining cells. The train and test datasets will be written to `valid_epochs_small_train_resplit.pickle` and `valid_epochs_small_test_resplit.pickle`, and the corresponding image embeddings will be written to `image_embeddings_vit_small_train_resplit.npy` and `image_embeddings_vit_test_small_resplit.npy` as dictionaries. Similarly, use the files `src/preprocessing/create_meg_fmri_subset.ipynb` and `src/preprocessing/create_meg_fmri_combined.ipynb` to create the subset of MEG images and the combination of MEG and fMRI images for the multimodal experiments.
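The output files are ordinary pickled dictionaries and `.npy` files holding dictionaries, which can be written and read back as sketched below (the dictionary keys and array shapes are assumptions for illustration; check the notebook for the actual layout):

```python
import pickle
import numpy as np

# Illustrative round-trip of the two storage formats used for the splits.
split = {"sub-01": np.zeros((10, 271, 600))}   # epochs keyed by subject (assumed layout)
with open("valid_epochs_small_train_resplit.pickle", "wb") as f:
    pickle.dump(split, f)

emb = {"sub-01": np.zeros((10, 768))}          # ViT embeddings keyed the same way
np.save("image_embeddings_vit_small_train_resplit.npy", emb, allow_pickle=True)

# Loading: .npy files saved from dicts need allow_pickle=True and .item()
with open("valid_epochs_small_train_resplit.pickle", "rb") as f:
    epochs = pickle.load(f)
embeddings = np.load("image_embeddings_vit_small_train_resplit.npy",
                     allow_pickle=True).item()
print(epochs["sub-01"].shape, embeddings["sub-01"].shape)
```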
- regression_fmri.ipynb
- regression_meg.ipynb
- regression_meg_fmri_subset.ipynb
- regression_meg_fmri_combined.ipynb
Set the appropriate paths from preprocessing in the opening cells, then run `regression_meg.ipynb` for linear regression on the MEG dataset, and `regression_meg_fmri_subset.ipynb`, `regression_fmri.ipynb`, and `regression_meg_fmri_combined.ipynb` for linear regression on the MEG subset, the fMRI images, and the combined data, respectively.
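All four notebooks fit the same kind of model: a linear map from brain features to image embeddings. A minimal ridge-regression version of that fit in NumPy (the feature/embedding dimensions and the regularization strength are illustrative assumptions):

```python
import numpy as np

def fit_ridge(X, Y, alpha=1.0):
    """Closed-form ridge regression: W = (X^T X + alpha I)^-1 X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

rng = np.random.default_rng(0)
X_train = rng.standard_normal((500, 271))   # flattened MEG features per trial
W_true = rng.standard_normal((271, 768))
Y_train = X_train @ W_true                  # target image embeddings

W = fit_ridge(X_train, Y_train, alpha=0.1)
pred = X_train @ W                          # near-perfect on this noiseless toy data
```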
The paths to the preprocessed MEG embeddings from the original dataset must be set in cell 2, and the path to the TSV file containing the high-level THINGS categories must be set in cell 11. The remaining cells can be run as is.
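Reading such a categories TSV can be sketched with the standard library's `csv` module (the column names `uniqueID` and `category` are assumptions; substitute whatever headers the actual THINGS file uses):

```python
import csv
import io

# Tiny stand-in for the THINGS high-level category TSV (headers are assumed).
tsv_text = "uniqueID\tcategory\naardvark\tanimal\nabacus\ttool\n"

reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
concept_to_category = {row["uniqueID"]: row["category"] for row in reader}
print(concept_to_category)  # {'aardvark': 'animal', 'abacus': 'tool'}
```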
- train_convnet.py
This script trains a dilated residual convnet (the SimpleConv architecture) on MEG data, with configurable options for dataset type, preprocessing method, loss functions, and more. To run it, ensure all necessary Python packages (e.g., PyTorch, NumPy, MNE, WandB) are installed, and use the following command with your desired configuration:
```shell
python train_convnet.py --epochs 100 --batch_size 128 --lr 3e-4 --warmup_lr 1e-6 --warmup_interval 1000 --output_dir ./output --save_interval 50 --print_interval 150 --wandb_project MEG_Project --early_stopping 4 --dropout 0.3 --dilation_type expo --embeddings_type vit --loss_func soft_clip_loss
```
- train_vit.py
This script trains a custom vision transformer that takes MEG data as input. To run the training script, ensure all necessary Python packages (e.g., PyTorch, NumPy, MNE, WandB) are installed, and use the command below with the desired arguments for the various hyperparameters:
```shell
python -u train_vit.py --epochs 100 --batch_size 128 --lr 3e-4 --warmup_lr 1e-6 --warmup_interval 1000 --output_dir ./output --save_interval 50 --print_interval 150 --wandb_project MEG_ViT_trial --early_stopping 4 --hidden_dropout 0.1 --attention_dropout_prob 0.0 --num_hidden_layers 4 --num_attention_heads 4 --hidden_size 64 --meg_channels 270 --patch_width 2 --embeddings_type dino --loss_func soft_clip_loss
```
To run the diffusion pipeline, modify the PATH variables in the Jupyter script: