A comprehensive visual stimulus presentation system for neuroscience experiments, designed to coordinate visual stimulation with various data acquisition devices.
This system enables precise control and synchronization of:
- Visual stimulus presentation on calibrated displays
- Two-photon microscope triggering
- Eye-tracking camera control
- Body-tracking camera control
- TTL signal collection for post-hoc data synchronization
Installation:

- Clone the repository:

  ```bash
  git clone [repository-url]
  ```

- Create and activate a conda environment using the provided environment file:

  ```bash
  conda env create -f simple-vs_env.yaml
  conda activate simple-vs
  ```

- Hardware setup:
  - Connect your National Instruments/PCO DAQ device (a quick connectivity check is sketched below)
  - Configure your display monitor
  - Connect your tracking cameras and microscope triggers
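If you are unsure whether the NI device is visible to the driver, a check along these lines can help. This is a minimal sketch using the `nidaqmx` package (listed under dependencies below); device names vary by setup:

```python
# Minimal sketch: list National Instruments devices visible to the driver.
import nidaqmx.system

system = nidaqmx.system.System.local()
for device in system.devices:
    # e.g. "Dev1 USB-6001" -- names depend on your NI MAX configuration
    print(device.name, device.product_type)
```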
Key features:

- Multiple experiment paradigms:
  - Retinotopy mapping
  - Texture analysis
  - Locally sparse noise
  - Dynamic battery
  - Simple orientation
  - Custom experiments via extensible base classes
- Flexible configuration via YAML files (see the loading sketch after this list) for:
  - Monitor settings and calibration
  - Experiment parameters
  - Data saving preferences
- Robust data acquisition (DAQ) support:
  - National Instruments
  - PCO
  - Other DAQ systems via extensible interfaces
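Since every configuration file is plain YAML, settings can be inspected or loaded programmatically. A minimal sketch, assuming PyYAML-style parsing; the `name` and `experiment_delay` keys come from the texture example later in this document:

```python
# Minimal sketch: load an experiment config file (PyYAML-style parsing assumed).
import yaml

with open('texture_FB-VGG_config.yaml') as f:
    config = yaml.safe_load(f)

print(config['name'])                  # e.g. texture_FB-VGG
print(config.get('experiment_delay'))  # e.g. 5
```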
Dependencies:

- `psychopy` - Visual stimulus presentation
- `nidaqmx` - National Instruments DAQ interface
- `numpy` - Numerical computations
- `pyaml` - Configuration file parsing
- `h5py` - Data storage
- `pandas` - Data analysis and logging
Project structure:

- `BaseExperiment.py` - Core experiment infrastructure
- `*Experiment.py` files - Specific experiment implementations
- `*DAQ.py` files - Data acquisition interfaces
- `*_config.yaml` files - Configuration templates
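As a rough illustration of how the pieces fit together, a new paradigm would live in its own `*Experiment.py` file and subclass the core class. The sketch below is hypothetical: the import path and the `run` method are assumptions, so check `BaseExperiment.py` for the actual interface before subclassing:

```python
# Hypothetical sketch of a custom experiment; the actual BaseExperiment
# interface may differ -- consult BaseExperiment.py before subclassing.
from BaseExperiment import BaseExperiment

class MyCustomExperiment(BaseExperiment):
    """A new paradigm built on the shared experiment infrastructure."""

    def run(self):
        # Present stimuli, trigger the DAQ, and log timestamps here.
        raise NotImplementedError
```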
To run an experiment:

- Configure your experiment parameters in the appropriate YAML file
- Set up your monitor configuration in `monitor_config.yaml` (see the PsychoPy sketch after this list)
- Configure data saving preferences in `save_settings_config.yaml`
- Run your experiment using the corresponding runner script (e.g., `run_TEX_experiment.py`)
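The monitor settings describe the physical display geometry that PsychoPy uses to convert stimulus sizes into degrees of visual angle. As a rough sketch of what the values in `monitor_config.yaml` correspond to (the numbers below are placeholders, not the project's defaults):

```python
# Sketch: how monitor settings map onto PsychoPy's calibration objects.
# Values are placeholders; the real ones belong in monitor_config.yaml.
from psychopy import monitors, visual

mon = monitors.Monitor('stimulus_monitor')
mon.setWidth(52.0)            # physical screen width, cm
mon.setDistance(15.0)         # eye-to-screen distance, cm
mon.setSizePix([1920, 1080])  # resolution in pixels

win = visual.Window(monitor=mon, units='deg', fullscr=True)
win.close()
```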
Example: texture (FB-VGG) experiment:

- Edit `texture_FB-VGG_config.yaml` to set your desired parameters:

  ```yaml
  name: texture_FB-VGG
  experiment_delay: 5
  # ... other parameters
  ```
- In `main.py`, uncomment the desired experiment (selection is currently manual; see the note below):

  ```python
  # Comment out other experiments and uncomment:
  experiment = TextureExperimentFBVGG(
      experiment_id=experiment_id,
      mouse_id=mouse_id,
      daq=daq,
      monitor_config_filename='monitor_config.yaml',
      save_settings_config_filename='save_settings_config.yaml',
      exp_config_filename='texture_FB-VGG_config.yaml',
      debug=bool_DEBUG,
  )
  ```
- Run the experiment:

  ```bash
  python main.py
  ```
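Experiment data are written to HDF5 via `h5py`, with the layout governed by `save_settings_config.yaml`. To get a quick look at what was saved, something like this works (the file path is a placeholder):

```python
# Sketch: list every group and dataset in a saved HDF5 file.
import h5py

with h5py.File('experiment_output.h5', 'r') as f:  # placeholder path
    f.visit(print)
```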
Example: retinotopy experiment:

- Configure `retinotopy_config.yaml`
- In `main.py`, uncomment the RetinotopyExperiment section
- Run:

  ```bash
  python main.py
  ```
Note: Selecting an experiment currently requires manually editing `main.py` to uncomment the desired block. Future versions may implement a command-line argument or configuration-based selection system; a possible sketch follows.
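For reference, one possible shape for such a command-line selector is sketched below; none of this exists in the current codebase:

```python
# Hypothetical command-line experiment selection (not yet implemented).
import argparse

parser = argparse.ArgumentParser(description='Run a visual stimulus experiment')
parser.add_argument('--experiment', required=True,
                    choices=['texture_FB-VGG', 'retinotopy', 'sparse-noise'],
                    help='Experiment paradigm to run')
args = parser.parse_args()
print(f'Selected experiment: {args.experiment}')
```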
This project is maintained by the Visual Neuroscience Group at UBC. For questions or contributions, please contact the repository maintainers.