Imperial Thesis

About

This is the GitHub repository for my Individual Project for the MSc Computing degree at Imperial College London. The project attempts to reconstruct musical stimuli from EEG signals using generative models. The report can be accessed here.

Dataset

The datasets used for training can be found here (NMED-T) and here (Film Music). Save them in ~/data/.
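As a sketch, the expected layout under ~/data/ might look like the following (the subdirectory names nmed-t and film-music are illustrative, not prescribed by the repository):

```shell
# Create a hypothetical layout for the two datasets under ~/data/
mkdir -p ~/data/nmed-t ~/data/film-music
# Downloaded NMED-T and Film Music files would then be placed
# in their respective subdirectories, e.g.:
#   ~/data/nmed-t/song1_eeg.mat
#   ~/data/film-music/stimulus1.wav
ls ~/data
```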

Preprocessing

There are two preprocessing pipelines. To preprocess the Film Music dataset from scratch, use the MATLAB scripts in code/preprocessing/matlab. To preprocess and segment the NMED-T dataset in preparation for training, use the files in code/preprocessing/python. The preprocessed data should be saved under data/nmed-t-prep/.
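The segmentation step can be sketched as slicing each multichannel EEG recording into fixed-length, possibly overlapping windows. The function below is an illustrative sketch, not the repository's actual implementation; the 125-channel shape and window sizes are assumptions for demonstration:

```python
import numpy as np

def segment_eeg(recording, window_len, hop):
    """Split a (channels, samples) EEG array into fixed-length windows.

    Windows of `window_len` samples are taken every `hop` samples,
    so hop < window_len yields overlapping segments.
    """
    n_channels, n_samples = recording.shape
    segments = []
    for start in range(0, n_samples - window_len + 1, hop):
        segments.append(recording[:, start:start + window_len])
    # Result shape: (n_segments, channels, window_len)
    return np.stack(segments)

# Illustrative: a 125-channel recording of 1000 samples,
# cut into 250-sample windows with 50% overlap.
eeg = np.random.randn(125, 1000)
segs = segment_eeg(eeg, window_len=250, hop=125)
print(segs.shape)  # (7, 125, 250)
```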

Models

Three models can be trained: CNN, cGAN and cCGAN. They can be found in code/model/models.

Training/Testing

Training script example:

python ./code/model/train.py --dataroot ./data/nmed-t-prep --name cgan_resnet_9_batch_8 --gpu_ids 0 --model pix2pix --netG resnet_9blocks  --dataset_mode supervised --display_id -1 --input_nc 1 --output_nc 1 --batch_size 8 --netD basic

Set --label_smoothing to train with label smoothing.

Citation

The code for the GANs is largely adapted from the original pix2pix/CycleGAN implementation.
