Large-Scale Visual Short-Term Memory - Ulloa and Horwitz 2016

Large-scale neural models (LSNMs) link neuroscience data across temporal and spatial scales to investigate the neural mechanisms that carry out cognitive tasks. The availability of structural connectivity maps of the whole brain (i.e., connectomes) offers an opportunity to incorporate macro-scale information into LSNMs. In this work, we demonstrate how to merge LSNMs with a connectome to simulate multi-subject short-term memory neuroimaging experiments. We provide a worked example of our approach (Ulloa and Horwitz, 2016) using a preexisting visual short-term memory model.
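To illustrate the general idea of coupling local neural units through a structural connectivity matrix, a minimal sketch follows. This is not the Ulloa and Horwitz model; all names, parameters, and the toy rate equations are illustrative assumptions, showing only how a connectome weight matrix can mediate interactions between brain regions in a simulation:

```python
import numpy as np

# Hypothetical sketch: leaky rate units coupled through a structural
# connectivity (connectome) matrix W. All values are illustrative,
# not taken from the actual visual short-term memory model.

rng = np.random.default_rng(0)

n_regions = 8                      # number of connectome nodes
W = rng.random((n_regions, n_regions))
np.fill_diagonal(W, 0.0)           # no self-connections
W /= W.sum(axis=1, keepdims=True)  # row-normalize coupling weights

dt, tau, g = 0.01, 0.1, 0.5        # time step, time constant, coupling gain
steps = 1000

def stim(t):
    # brief external input to region 0, mimicking a visual stimulus
    return np.eye(n_regions)[0] if 1.0 <= t < 1.5 else np.zeros(n_regions)

x = np.zeros(n_regions)            # regional activity
trace = np.empty((steps, n_regions))

for i in range(steps):
    t = i * dt
    inp = g * W @ np.tanh(x) + stim(t)   # connectome-mediated coupling
    x = x + dt * (-x + inp) / tau        # leaky integration (Euler step)
    trace[i] = x

print(trace.shape)
```

The resulting `trace` array holds a time series of activity per region, the kind of output that could then be passed through a hemodynamic model to produce simulated neuroimaging (e.g., fMRI) signals for multiple synthetic subjects.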

Scientific Coordinator: Antonio Ulloa

This model was originally developed in: Python

The code for this model is hosted on GitHub: