Signs in sign languages have primarily been analyzed as composed of three main formational elements: handshape, location, and movement. Measuring the variation in any of these elements across signs and sign languages requires extensive manual annotation of each feature against a predefined inventory of possible formations. Such a process is time-consuming, and it rests on abstract representations that cannot capture the idiosyncrasies of individual signers.
In my presentation, I showcase my newly developed tool for measuring and visualizing variation in movement and location between different semantic fields of two sign languages, namely Ghanaian Sign Language and American Sign Language. The tool uses the pre-trained pose estimation framework of Cao et al. [1] to track the body joints of different signers. Dynamic Time Warping, an algorithm that measures the similarity between two temporal sequences, is then applied to quantify the variation in the dominant hand's path across all the signs. By averaging all positions of the dominant hand's wrist for each sign within a semantic field, the tool also visualizes the average location feature. Finally, the tool's results are compared against manual annotations to validate the overall output.
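To make the first step of the pipeline concrete, the sketch below collects per-frame wrist coordinates into a trajectory. It is a minimal illustration, not the tool itself: it assumes OpenPose-style JSON output (one file per frame, a "people" list whose entries hold a flat pose_keypoints_2d array of x, y, confidence triples, with the right wrist at keypoint index 4 in the COCO/BODY_25 layouts), treats the right hand as dominant, and the directory layout, confidence threshold, and the load_wrist_trajectory helper are all hypothetical.

```python
import json
from pathlib import Path

import numpy as np

RIGHT_WRIST = 4  # keypoint index for the right wrist in OpenPose's COCO/BODY_25 layouts


def load_wrist_trajectory(frames_dir: str, min_conf: float = 0.3) -> np.ndarray:
    """Collect (x, y) positions of the signer's right wrist across all frames."""
    points = []
    for frame_file in sorted(Path(frames_dir).glob("*.json")):
        data = json.loads(frame_file.read_text())
        if not data["people"]:  # no signer detected in this frame
            continue
        # Flat array [x0, y0, c0, x1, y1, c1, ...]; slice out the wrist triple.
        kp = data["people"][0]["pose_keypoints_2d"]
        x, y, conf = kp[3 * RIGHT_WRIST : 3 * RIGHT_WRIST + 3]
        if conf >= min_conf:  # skip low-confidence detections
            points.append((x, y))
    return np.array(points)
```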
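The two core computations described above, comparing movement paths and averaging locations, can likewise be sketched in a few lines. The plain dynamic-programming DTW shown here stands in for whatever implementation the tool actually uses, and the dtw_distance and mean_location helpers, along with the toy trajectories, are illustrative assumptions rather than the tool's code.

```python
import numpy as np


def dtw_distance(path_a: np.ndarray, path_b: np.ndarray) -> float:
    """Dynamic Time Warping cost between two (frames, 2) wrist trajectories."""
    n, m = len(path_a), len(path_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(path_a[i - 1] - path_b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(
                cost[i - 1, j],      # insertion
                cost[i, j - 1],      # deletion
                cost[i - 1, j - 1],  # match
            )
    return float(cost[n, m])


def mean_location(paths: list[np.ndarray]) -> np.ndarray:
    """Average all wrist positions across the signs of one semantic field."""
    return np.concatenate(paths).mean(axis=0)


# Hypothetical example: two signs, each a short (frames, 2) wrist trajectory.
sign_a = np.array([[0.50, 0.40], [0.55, 0.45], [0.60, 0.50]])
sign_b = np.array([[0.52, 0.41], [0.58, 0.48], [0.63, 0.52], [0.65, 0.55]])

print(dtw_distance(sign_a, sign_b))     # movement variation between the two signs
print(mean_location([sign_a, sign_b]))  # average location for the semantic field
```

Running dtw_distance over every pair of signs would yield the kind of pairwise movement-variation matrix the tool visualizes, while mean_location gives the single averaged point used for the location feature.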