TS2Image 🌠
This is the implementation of my work presented in partial fulfillment of the requirements for the degree of Bachelor in Computer Engineering: TS2Image: a software to convert EEG time series into images for training brain-computer interface convolutional neural networks
A tool to read EEG data from GDF files and export as images using Gramian Angular Field or ERSP.
Setup
Dependencies
TS2Image was developed using Python 3.8.2, which you can download and install from the official site.
To install all third party dependencies run the following in your terminal:
pip install -r requirements.txt
Configuration
- Set the directory containing the files you want to process:
input_folder = current_working_directory + "/datasets"
- Set the root output folder:
output_folder = current_working_directory + '/images-output'
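For reference, `current_working_directory` can be obtained with Python's standard library; this is a minimal sketch of how the two path settings above might be built (the variable names follow the configuration section, the use of `os.getcwd()` is an assumption):

```python
import os

# Resolve paths relative to the directory the script is run from (assumption).
current_working_directory = os.getcwd()

# Directory containing the files to process.
input_folder = current_working_directory + "/datasets"

# Root folder where generated images will be written.
output_folder = current_working_directory + '/images-output'
```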
- Create an events dictionary according to your dataset. For example, these are the events from some files in the BCI Competition IV dataset:
DESCRIPTION_EYES_OPEN = "276"
DESCRIPTION_EYES_CLOSED = "277"
DESCRIPTION_START_TRIAL = "768"
DESCRIPTION_CUE_LEFT = "769"
DESCRIPTION_CUE_RIGHT = "770"
DESCRIPTION_BCI_FEEDBACK = "781"
DESCRIPTION_CUE_UNKNOWN = "783"
DESCRIPTION_REJECTED_TRIAL = "1023"
DESCRIPTION_UNKNOWN_GROUP = "1072"
DESCRIPTION_EYE_MOVEMENT_HORIZONTAL = "1077"
DESCRIPTION_EYE_MOVEMENT_VERTICAL = "1078"
DESCRIPTION_EYE_ROTATION = "1079"
DESCRIPTION_EYE_BLINK = "1081"
DESCRIPTION_START_NEW_RUN = "32766"
BCI_competition_dataset_events_dictionary = {
DESCRIPTION_EYES_OPEN:'Idling EEG (eyes open)',
DESCRIPTION_EYES_CLOSED:'Idling EEG (eyes closed)',
DESCRIPTION_START_TRIAL:'Start of a trial',
DESCRIPTION_CUE_LEFT:'Cue onset left (class 1)',
DESCRIPTION_CUE_RIGHT:'Cue onset right (class 2)',
DESCRIPTION_BCI_FEEDBACK:'BCI feedback (continuous)',
DESCRIPTION_CUE_UNKNOWN:'Cue unknown',
DESCRIPTION_REJECTED_TRIAL:'Rejected trial',
DESCRIPTION_UNKNOWN_GROUP:'Unknown Group',
DESCRIPTION_EYE_MOVEMENT_HORIZONTAL:'Horizontal eye movement',
DESCRIPTION_EYE_MOVEMENT_VERTICAL:'Vertical eye movement',
DESCRIPTION_EYE_ROTATION:'Eye rotation',
DESCRIPTION_EYE_BLINK:'Eye blinks',
DESCRIPTION_START_NEW_RUN:'Start of a new run'
}
- List of events from the events dictionary that you want to export from your dataset:
valid_events_descriptions = [DESCRIPTION_CUE_LEFT, DESCRIPTION_CUE_RIGHT]
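With the dictionary and the list above, the classes that will be exported can be checked with a simple lookup; a small self-contained sketch (repeating only the relevant constants from the configuration):

```python
DESCRIPTION_CUE_LEFT = "769"
DESCRIPTION_CUE_RIGHT = "770"

BCI_competition_dataset_events_dictionary = {
    DESCRIPTION_CUE_LEFT: 'Cue onset left (class 1)',
    DESCRIPTION_CUE_RIGHT: 'Cue onset right (class 2)',
}

valid_events_descriptions = [DESCRIPTION_CUE_LEFT, DESCRIPTION_CUE_RIGHT]

# Human-readable labels for the events that will be exported as images.
labels = [BCI_competition_dataset_events_dictionary[d]
          for d in valid_events_descriptions]
print(labels)  # ['Cue onset left (class 1)', 'Cue onset right (class 2)']
```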
- Start time window padding, in seconds. Negative values are accepted:
t_start = 0
- Time window length, in seconds:
duration = 4
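Putting the window settings together: a window of `duration` seconds at sampling rate `sfreq` contains `duration * sfreq` samples, and the Gramian Angular Field turns that 1-D window into a square image. A minimal NumPy sketch of the summation-field (GASF) variant on a toy signal, not the tool's actual implementation; the 250 Hz sampling rate is an assumption based on the BCI Competition IV recordings:

```python
import numpy as np

def gasf(x: np.ndarray) -> np.ndarray:
    """Gramian Angular Summation Field of a 1-D series."""
    # Rescale the series into [-1, 1] so arccos is defined.
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1.0, 1.0))  # polar-coordinate angles
    # GASF[i, j] = cos(phi_i + phi_j)
    return np.cos(phi[:, None] + phi[None, :])

sfreq = 250      # assumed sampling rate, in Hz
duration = 4     # window length from the configuration above, in seconds
n_samples = duration * sfreq

signal = np.sin(np.linspace(0, 8 * np.pi, n_samples))  # toy "EEG" channel
image = gasf(signal)
print(image.shape)  # (1000, 1000)
```

The resulting matrix is symmetric with values in [-1, 1], so it can be mapped directly to pixel intensities when saving as an image.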