**Body Orientation Illustration**

This repository contains data and code that reproduce figures from the paper [Sosa-León, V., Schwering, A. (2021): Detecting socially occupied spaces with depth cameras: evaluating location and body orientation as relevant social features. IPIN 2021: 11th International Conference on Indoor Positioning and Indoor Navigation.](https://www.researchgate.net/publication/356784002_Detecting_socially_occupied_spaces_with_depth_cameras_evaluating_location_and_body_orientation_as_relevant_social_features)

**Usage**

There are two main ways to use this repository:

- Download the repository and open the file 'BodyOrientations.ipynb' in your local Jupyter Notebook installation by clicking it in the file list. The code is divided into commented sections. Note that this approach is likely to fail in the future as software versions change.
- Click [![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/violetasdev/bodyorientation_example/HEAD) to open an interactive Jupyter Notebook session in your web browser, where you can run or update the code. Remember that any changes to the code will be lost as soon as you close the browser window.

In general, to use the **Jupyter Notebook**:

- Wait until the Jupyter Notebook interface has loaded.
- The left panel gives access to the files. The **data** folder contains the .csv file with the body orientation dataset.
- From the left panel, select the notebook BodyOrientations.ipynb.
- The right panel provides an interactive interface for running the code. Start from the first cell and run each cell with the play button.

**About**

We propose a system that uses infrared depth cameras to anonymously derive body orientation from skeleton joints. The orientation of the shoulders and spine, together with the face orientation and temporal information on occupants' walking trajectories, is used to calculate the body orientation from which socially occupied space is identified. In a user study evaluating the system, we collected data on 32 patterns across two distinct cases, individuals and dyads, and evaluated the system's accuracy against both the intended orientation and the socially accepted orientation. Our algorithm for detecting body orientation contributes to the automated detection of socially occupied spaces.

**License**

MIT License
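As an illustration of the idea described above, the orientation of the shoulder line can be turned into a facing angle with simple planar geometry. This is only a minimal sketch, not the repository's actual implementation: the function name, the (x, y) ground-plane coordinates, and the convention that the torso faces the shoulder vector rotated by -90 degrees are all assumptions for the example.

```python
import math

def body_orientation_deg(left_shoulder, right_shoulder):
    """Estimate a facing angle (degrees, counter-clockwise from +x)
    from two shoulder joints given as (x, y) ground-plane coordinates.

    Hypothetical sketch: assumes the torso faces perpendicular to the
    shoulder line, obtained by rotating the left-to-right shoulder
    vector by -90 degrees.
    """
    # Shoulder-line vector, left shoulder to right shoulder.
    dx = right_shoulder[0] - left_shoulder[0]
    dy = right_shoulder[1] - left_shoulder[1]
    # Rotate by -90 degrees to get the assumed facing direction.
    fx, fy = dy, -dx
    return math.degrees(math.atan2(fy, fx)) % 360

# With shoulders along +y (left at origin, right above it),
# the facing direction points along +x, i.e. 0 degrees.
print(body_orientation_deg((0.0, 0.0), (0.0, 1.0)))
```

In the paper's full system this geometric estimate is combined with the spine and face orientation and with temporal trajectory information, which disambiguates the front/back choice that a single shoulder line leaves open.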