<p>The original dataset <em>VerSe 2019</em> (<a href="https://doi.org/10.1148/ryai.2020190138" rel="nofollow">https://doi.org/10.1148/ryai.2020190138</a>)</p>
<ul>
<li>includes 160 CT image series of 141 patients with segmentation masks of 1,725 vertebrae,</li>
<li>is split into a training (n=80), validation (n=40), and test (n=40) set,</li>
<li>was prepared for a vertebral labeling and segmentation challenge hosted at the 2019 International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI).</li>
</ul>
<p>Please read the readme.txt and licence.txt: <a href="https://osf.io/pg6s3/" rel="nofollow">https://osf.io/pg6s3/</a> (Creative Commons licence, CC BY-SA 2.0).</p>
<p>This dataset is available in two different ways:</p>
<ol>
<li><strong>Image series based (MICCAI):</strong> 160 image series of 141 patients are divided into a training (n=80), validation (n=40), and test (n=40) set, as originally published for the MICCAI challenge.</li>
<li><strong>Subject based:</strong> 141 patients holding 160 image series are divided into a training (n=67), validation (n=37), and test (n=37) set. For patients with a single image series, the subject identifier equals the image series identifier ('verseXXX'); for patients with two or three image series, new subject identifiers ('verse4XX') were introduced. This is an adaptation of the Brain Imaging Data Structure (BIDS; <a href="https://bids.neuroimaging.io/" rel="nofollow">https://bids.neuroimaging.io/</a>).</li>
</ol>
<p>We recommend using the subject-based format, as it is fully consistent between VerSe 2019 and VerSe 2020. Please note that the VerSe 2019 cases that were re-used in the VerSe 2020 challenge have to be downloaded from the subject-based VerSe 2019 repository; they are not included again in the subject-based VerSe 2020 repository. In addition, we offer Python scripts to easily work with the subject-based data structure here: <a href="https://github.com/anjany/verse" rel="nofollow">https://github.com/anjany/verse</a></p>
<p>Please respect our work: we spent more than two years on algorithmic development and more than 2,000 working hours on manual correction of the segmentation masks.</p>
<p><strong>By downloading this data you agree to cite these papers in your work:</strong></p>
<ol>
<li>Löffler M, Sekuboyina A, Jakob A, Grau AL, Scharr A, Husseini ME, Herbell M, Zimmer C, Baum T, Kirschke JS. A Vertebral Segmentation Dataset with Fracture Grading. Radiology: Artificial Intelligence, 2020. <a href="https://doi.org/10.1148/ryai.2020190138" rel="nofollow">https://doi.org/10.1148/ryai.2020190138</a></li>
<li>Sekuboyina A, et al. Labelling Vertebrae with 2D Reformations of Multidetector CT Images: An Adversarial Approach for Incorporating Prior Knowledge of Spine Anatomy. Radiology: Artificial Intelligence, 2020. <a href="https://doi.org/10.1148/ryai.2020190074" rel="nofollow">https://doi.org/10.1148/ryai.2020190074</a></li>
<li>Sekuboyina A, Bayat AH, Husseini ME, Löffler M, Menze BM, ..., Kirschke JS. VerSe: A Vertebrae Labelling and Segmentation Benchmark. <a href="https://arxiv.org/abs/2001.09193" rel="nofollow">https://arxiv.org/abs/2001.09193</a></li>
</ol>
<p>An overview of the data is provided in reference 1. The methods used to generate the initial segmentations, which were afterwards corrected manually, are detailed in references 2 and 3.</p>
<p>This work has been supported by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 637164 — iBack — ERC-2014-STG).</p>
<p>Scans and segmentation masks are stored in NIfTI format (<a href="https://nifti.nimh.nih.gov/" rel="nofollow">https://nifti.nimh.nih.gov/</a>). Coordinates of vertebral body centroids per vertebral level are stored in JSON format. Please refer to the Component Wiki pages for further details.</p>
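<p>As a minimal sketch of working with the centroid files, the snippet below parses a JSON list of per-vertebra records into a label-to-coordinates mapping. The field names (<code>label</code>, <code>X</code>, <code>Y</code>, <code>Z</code>) and the sample values are assumptions for illustration; check the Component Wiki pages and the readme.txt for the exact schema of the released files.</p>

```python
import json

# Hypothetical example of a per-scan centroid file: a list of vertebra
# records, each with an integer vertebral label and X/Y/Z coordinates.
# The real field names may differ; this is an illustrative assumption.
sample = """
[
  {"label": 20, "X": 101.3, "Y": 214.0, "Z": 55.7},
  {"label": 21, "X": 102.8, "Y": 219.5, "Z": 83.1}
]
"""

def load_centroids(text):
    """Map vertebral label -> (X, Y, Z) centroid coordinates."""
    records = json.loads(text)
    return {r["label"]: (r["X"], r["Y"], r["Z"])
            for r in records if "label" in r}

centroids = load_centroids(sample)
print(sorted(centroids))  # vertebral labels present in this scan
```

<p>A dictionary keyed by vertebral label makes it straightforward to look up a specific level, or to intersect the labels present in a scan with those in its segmentation mask.</p>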
<h2>Licence and Ethics</h2>
<p>The data is published under the CC BY-SA 2.0 licence (see licence.txt). When using the data, you must cite the three papers mentioned above.</p>
<p>Ethical approval to publish this data has been obtained from the local ethics committee at the Technical University of Munich (Proposal 27/19 S-SR).</p>
<p>The follow-up of this project is available here: <a href="https://osf.io/t98fz/" rel="nofollow">https://osf.io/t98fz/</a></p>