Title: UbiGesture: Customizing and Profiling Hand Gestures in Ubiquitous Environment
Authors: Atia, Ayman; Takahashi, Shin; Misue, Kazuo; Tanaka, Jiro
Type: Book chapter
Date issued: 2009
Date accessioned/available: 2020-02-19
ISBN: 978-3-642-02576-1
DOI: https://doi.org/10.1007/978-3-642-02577-8_16
MSA Google Scholar: https://cutt.ly/mrMu0gt
Institution: October University for Modern Sciences and Arts (MSA)
Language: en
Keywords: Ubiquitous environment; Hand gesture profiles; Context aware services

Abstract: One of the main challenges of interaction in a ubiquitous environment is the use of hand gestures to interact with day-to-day applications. Such interaction may be negatively affected by changes in the user's position, the interaction device, or the level of social acceptance of a specific hand gesture. We present UbiGesture, an architecture for developers and users who frequently change locations while interacting in ubiquitous environments. The architecture enables applications to be operated by hand gestures and lets ordinary users customize their own hand gestures for interacting with computers in context-aware ubiquitous environments. UbiGesture is based on combining user preferences, location, input/output devices, applications, and hand gestures into one profile. A prototype implementation of UbiGesture is presented, followed by a preliminary subjective and objective evaluation of UbiGesture while interacting in different locations with different hand gesture profiles.