# As simple as possible

This is a repo sharing some of the data published in:

_Visual prototypes in the ventral stream are attuned to complexity and gaze behaviour_, authored by Rose O., Johnson J.K., Wang B. and Ponce C.R.* and published in _Nature Communications_ (date TBA, end of 2021).

The title references one of our central findings, which highlights the strength of using neuron-guided image synthesis instead of hand-curated sets of visual stimuli to probe neural responses. We were able to show that these hand-curated sets are too simplistic. The images the neurons synthesized had higher part-complexity and reconstruction-complexity than traditional stimulus sets, but less complexity than photographs, and less than our image synthesis algorithm is capable of producing. This implies that the features these neurons process must be both rich and sufficiently flexible to allow various kinds of patterns to be decoded, but not so rich that they overfit incidental features of the visual world; to paraphrase Roger Sessions, the code must be “as simple as possible, but not simpler.”

Here we share the latent vectors used for the image synthesis experiment and the responses to the images generated from those latent vectors, along with sufficient metadata to reproduce most of the results in the publication. We do not share the gaze behaviour data or the baseline reference images, due to concerns about privacy and the distribution of possibly copyrighted material. However, the article describes procedures for researchers to request this material. We have a GitHub repository for sharing examples of how to implement the key methods of the paper.
## See: [PonceLab/as-simple-as-possible](https://github.com/PonceLab/as-simple-as-possible)

#### Here is an explanation of the data objects

NOTE: this is a partial share; the rest of the data will be made public by XXXXX

Also find an enclosed Matlab Livescript covering this again, along with an example of loading the data and generating a figure for the paper: [Example loading data](https://osf.io/mu9hr/)

To generate images from the latent vectors saved here, use this software: [willwx/XDream](https://github.com/willwx/XDream)

- `prefChan` -> Channel number across all arrays
- `isLiveChan` -> Whether the channel is visually responsive
- `rateChange_syn_nat_nonParametric` -> Mean of the last 10 generations' responses minus the first generation's responses, bootstrapped 500 times. Dimensions: 1 x [synthesized images, baseline reference images] x [mean of bootstrap, std of bootstrap]
- `ephysFN` -> Name of the raw ephys file (not shared; useful for double-checking file order; unique to this entry)
- `iChan_in_rasters` -> Channel index in Plexon's spike-sorted data file
- `genesAll` -> All the genes of all images synthesized. Use the following software to synthesize images: https://github.com/willwx/XDream
- `gen` -> The generation corresponding to the genes in `genesAll` (same order)
- `imagePosition` -> Image position in degrees of visual field
- `imageSize` -> Image size in degrees of visual field
- `pixPerDeg` -> Screen resolution
- `mask` -> A mask showing the estimated strength of visual responses across the image area (estimated separately)
- `evoConfiguration` -> Configuration details for the custom experimental MonkeyLogic software
- `resp` -> Mean over the image presentation of a Gaussian filter applied to boolean spike events
- `resp_stimuli` -> Strings corresponding to the images presented, in order of presentation. Some genes from the last generation are not presented, because the experiment is terminated during the last generation.
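As a rough illustration of how a bootstrapped rate change like `rateChange_syn_nat_nonParametric` could be computed (the exact procedure is described in the paper; the function and variable names below are our own, hypothetical), here is a minimal sketch in Python using only the standard library:

```python
import random
import statistics

def bootstrap_rate_change(last_gen_resps, first_gen_resps, n_boot=500, seed=0):
    """Bootstrap the mean difference between late- and early-generation responses.

    `last_gen_resps` and `first_gen_resps` are flat lists of trial responses
    (a hypothetical layout; the shared files store these per channel).
    Returns (mean of bootstrap, std of bootstrap), matching the last axis
    described for rateChange_syn_nat_nonParametric.
    """
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        # Resample each response set with replacement, then take the difference of means.
        last_sample = [rng.choice(last_gen_resps) for _ in last_gen_resps]
        first_sample = [rng.choice(first_gen_resps) for _ in first_gen_resps]
        diffs.append(statistics.mean(last_sample) - statistics.mean(first_sample))
    return statistics.mean(diffs), statistics.stdev(diffs)

# Toy example: responses rose from ~1.0 to ~3.0 over the experiment.
m, s = bootstrap_rate_change([3.0, 3.2, 2.8, 3.1], [1.0, 1.1, 0.9, 1.2])
```

This is only a sketch of the general non-parametric idea; consult the paper and the enclosed Livescript for the exact computation.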
The naming system is: `block082_gen_gen000_000019` -> "20th image synthesized for the 83rd generation"; generation 0 is sampled from a standard set. The images are not shared.

- `generatorInfo` -> A string identifying which image synthesis algorithm was used
- `refPics` -> Names of the images used for the references (as used in `rateChange_syn_nat_nonParametric`). The images are not shared.
- `fileNdx` -> Index corresponding to the animal used (1 -> A, 2 -> B)
- `iExp` -> Index corresponding to this file (unique numerical ID)
- `picLoc` -> Location where the synthetic images were stored
- `resp2gene` -> The responses to each gene in `genesAll`
- `picSaveErrors` -> Errors encountered when validating that `resp_stimuli` corresponds to the same ordering as `resp` and `genesAll`
- `picGeneRsquare` -> Pixelwise R-squared between the image saved to disk (`resp_stimuli`) and its subsequent re-generation from `genesAll`
- `areaNdx` -> Numerical index indicating which anatomical region the electrode channel was in
- `areaNames` -> Alphanumeric code of the brain region: V1/V2, V4, or IT
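The stimulus-name convention above can be parsed mechanically when matching entries of `resp_stimuli` to `genesAll`. A small Python sketch (the helper name is ours, not part of the shared data):

```python
import re

def parse_stim_name(name):
    """Parse names like 'block082_gen_gen000_000019' into their parts.

    Returns (block, generation, image_index) as integers, zero-based as in
    the file names themselves (e.g. index 19 is the 20th image).
    """
    m = re.fullmatch(r"block(\d+)_gen_gen(\d+)_(\d+)", name)
    if m is None:
        raise ValueError(f"unrecognized stimulus name: {name}")
    return tuple(int(g) for g in m.groups())

block, gen, idx = parse_stim_name("block082_gen_gen000_000019")
# block=82, gen=0, idx=19 (the 20th image; generation 0 is the standard set)
```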