Abstract from the related paper:
> Despite the importance of usability in human-machine interaction (HMI), most commonly
used devices are not usable by all potential users. In particular, users with little
or no technological experience, or with special needs, require carefully designed
systems and easy-to-use interfaces that support recognition over recall. To this
end, Natural
User Interfaces (NUIs) represent an effective strategy as the user’s learning is facilitated
by features of the interface that mimic the human “natural” sensorimotor embodied
interactions with the environment. This paper compares the usability of a new NUI (based
on an eye-tracker and hand gesture recognition) with a traditional interface (keyboard)
for the distal control of a simulated drone flying in a virtual environment. The whole
interface relies on “dAIsy”, a new software allowing the flexible use of different input
devices and the control of different robotic platforms. Each of the 59 users involved in the
study was required to complete two tasks with each interface, while their performance
was recorded: (a) exploration: detecting trees embedded in an urban environment; (b)
accuracy: guiding the drone as accurately and quickly as possible along a predefined
track. They were then administered questionnaires regarding their background, the
perceived embodiment of the device, and the perceived quality of the virtual experience
while using either the NUI or the traditional interface. The results are mixed and
call for further investigation: (a) contrary to our hypothesis, the specific NUI used led to
lower performance than the traditional interface; (b) however, the NUI was evaluated as
more natural and embodied. The final part of the paper discusses the possible causes
underlying these results and suggests future improvements to the NUI.