The data stored here were previously used in a project comparing the representation of multiple objects between the parietal and ventral visual streams. Details of the fMRI data can be found in this publication: Jeong, S. K., & Xu, Y. (2017). Task-context-dependent linear representation of multiple visual objects in human parietal cortex. Journal of Cognitive Neuroscience, 29(10), 1778-1789. (link)

The following project has used this data set: Mocz, V., Jeong, S. K., Chun, M., & Xu, Y. (2023). Representing Multiple Visual Objects in the Human Brain and Convolutional Neural Networks. (in prep)

There are three folders here: one for the code of the experiments, one for the experiment data, and one for figures of the experiments. The code and experimental data folders are described in more detail below.

(1) Experimental Data, 3 subfolders

- images: the images that were presented both to CNNs and to human participants in the fMRI scanner. There are four object categories: guitar, shoe, couch, and bicycle. An object can appear either at the top or the bottom of the image, and objects are presented either alone or in pairs.

- fMRI_data: the responses from all voxels in LO across the different conditions of the experiment. A matrix might be, for example, 10 x 20 x 136, where 10 is the number of runs in the fMRI experiment, 20 is the number of image conditions the participant viewed (e.g., bike on top alone, or couch on top + guitar on bottom), and 136 is the number of voxels in LO for that participant. The 20 conditions are ordered as follows: shoeTop, shoeBottom, bikeTop, bikeBottom, couchTop, couchBottom, guitarTop, guitarBottom, shoeTopBikeBottom, shoeBottomBikeTop, shoeTopCouchBottom, shoeBottomCouchTop, shoeTopGuitarBottom, shoeBottomGuitarTop, bikeTopCouchBottom, bikeBottomCouchTop, bikeTopGuitarBottom, bikeBottomGuitarTop, couchTopGuitarBottom, couchBottomGuitarTop.

- CNN_data: the responses from all units of Cornet-S (link) and three versions of Resnet-50 trained on stylized ImageNet images (link). These CNNs were developed in PyTorch, while the other pretrained CNNs that were also analyzed (Alexnet, Googlenet, VGG-19, the original Resnet-50) are built into MATLAB (in which the analysis code was written), so only the data for the PyTorch models are included in this folder. Each file is named in the form "CNN Name_CNN Layer_Image Name.npy" and holds the response of all units in a specific CNN layer to one image. The matrix might be, for example, 1 x 512, where 512 is the number of units in that CNN layer. (A short loading sketch follows this list.)
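To make the data layout concrete, here is a minimal Python sketch of loading and indexing these files. The participant file name, the example CNN file name, and the assumption that the fMRI matrices load directly as NumPy arrays are hypothetical (adjust the loader, e.g. scipy.io.loadmat, if they are stored as .mat files); only the 10 x 20 x 136 shape, the condition order, and the CNN file-naming scheme come from the description above.

```python
import numpy as np

# The 20 image conditions, in the order documented above.
CONDITIONS = [
    "shoeTop", "shoeBottom", "bikeTop", "bikeBottom",
    "couchTop", "couchBottom", "guitarTop", "guitarBottom",
    "shoeTopBikeBottom", "shoeBottomBikeTop",
    "shoeTopCouchBottom", "shoeBottomCouchTop",
    "shoeTopGuitarBottom", "shoeBottomGuitarTop",
    "bikeTopCouchBottom", "bikeBottomCouchTop",
    "bikeTopGuitarBottom", "bikeBottomGuitarTop",
    "couchTopGuitarBottom", "couchBottomGuitarTop",
]

# Hypothetical participant file; the actual names are listed in sublist.txt.
fmri = np.load("fMRI_data/participant_01.npy")  # (runs, 20 conditions, voxels)
mean_pattern = fmri.mean(axis=0)                # average over runs -> (20, voxels)

# Voxel pattern for one object pair and the average of its two single objects.
pair = mean_pattern[CONDITIONS.index("shoeTopBikeBottom")]
single_avg = (mean_pattern[CONDITIONS.index("shoeTop")] +
              mean_pattern[CONDITIONS.index("bikeBottom")]) / 2

# One CNN response file, following the "CNN Name_CNN Layer_Image Name.npy"
# naming scheme (the example name is hypothetical); shape is 1 x n_units.
units = np.load("CNN_data/cornet-s_V4_shoeTop.npy").ravel()
```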
(2) Code, 2 subfolders

- fMRI: pattern_analysis.py, as the name suggests, analyzes the correlation between the full voxel pattern for an object pair and the average of the patterns for the corresponding single objects. pattern_analysis_near_far.py runs the same analysis restricted to voxels whose response-amplitude averaging slope (from the unit analysis) was near or far from 0.5, with the near condition defined as a slope between 0.45 and 0.55 and the far condition defined as a slope below 0.45 or above 0.55. unit_analysis.py, as the name suggests, analyzes the slope of individual voxels between the average of the single-object responses and the object-pair response. A utilities Python file contains helper functions used by the other Python files. sublist.txt lists the file names of the participant fMRI data used in the Python analyses, in the form participant_01, participant_02, etc.

- CNNs: pattern_analysis_cornet.m performs the same pattern analysis as for the fMRI data, but for the Cornet-S CNN. pattern_analysis_resnetSIN.m is the pattern analysis for the three versions of Resnet-50 trained on stylized ImageNet. pattern_analysis_all_other_CNNs.m is the pattern analysis for Alexnet, Googlenet, VGG-19, and the original Resnet-50. unit_analysis.py performs the same individual-unit analysis as for the fMRI data, but on CNN units. Again, a utilities Python file contains helper functions for the Python code. (A sketch of the two analyses follows this list.)
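For orientation, here is a minimal sketch of the two analyses as described above: correlating each pair's pattern with the average of its two single-object patterns, and fitting a per-unit slope across the pair conditions to separate "near 0.5" from "far" voxels. This mirrors the described logic of pattern_analysis.py and unit_analysis.py but is not the authors' code; the function names, the regression form of the slope fit, and the input conventions are illustrative assumptions.

```python
import re
import numpy as np

SINGLES = ["shoeTop", "shoeBottom", "bikeTop", "bikeBottom",
           "couchTop", "couchBottom", "guitarTop", "guitarBottom"]
PAIRS = ["shoeTopBikeBottom", "shoeBottomBikeTop",
         "shoeTopCouchBottom", "shoeBottomCouchTop",
         "shoeTopGuitarBottom", "shoeBottomGuitarTop",
         "bikeTopCouchBottom", "bikeBottomCouchTop",
         "bikeTopGuitarBottom", "bikeBottomGuitarTop",
         "couchTopGuitarBottom", "couchBottomGuitarTop"]
CONDITIONS = SINGLES + PAIRS

def components(pair_name):
    """Split a pair name like 'shoeTopBikeBottom' into ('shoeTop', 'bikeBottom')."""
    first, pos, rest = re.match(r"(\w+?)(Top|Bottom)(\w+)", pair_name).groups()
    return first + pos, rest[0].lower() + rest[1:]

def pattern_and_unit_analysis(fmri):
    """fmri: (runs, 20, n_voxels) array in the condition order above."""
    mean = fmri.mean(axis=0)                           # (20, n_voxels)
    pair_resp = np.array([mean[CONDITIONS.index(p)] for p in PAIRS])
    single_avg = np.array([(mean[CONDITIONS.index(a)] +
                            mean[CONDITIONS.index(b)]) / 2
                           for a, b in map(components, PAIRS)])

    # Pattern analysis: correlate each pair's whole voxel pattern with the
    # average pattern of its two constituent single objects.
    pattern_r = [np.corrcoef(p, s)[0, 1]
                 for p, s in zip(pair_resp, single_avg)]

    # Unit analysis: per-voxel slope of the pair response against the
    # single-object average, fit across the 12 pair conditions. Slopes
    # between 0.45 and 0.55 count as "near" 0.5, the rest as "far".
    slopes = np.array([np.polyfit(single_avg[:, v], pair_resp[:, v], 1)[0]
                       for v in range(pair_resp.shape[1])])
    near = (slopes >= 0.45) & (slopes <= 0.55)
    return np.mean(pattern_r), slopes, near
```

The same logic applies to the CNN unit analysis, with a layer's unit responses to each image standing in for the voxel patterns.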