Orientation-Based Interaction on Mobile Devices and Desktops - An Evaluation

Abstract:

Mobile devices such as smartphones, tablets, and smartwatches have become essential items in our daily lives. Moreover, more and more people use smart mobile devices in their everyday work, for example for remote control, inspecting diagrams, or performing web analytics. However, the full potential of mobile devices is not yet tapped: built-in sensors such as accelerometers and gyroscopes offer a wide range of interaction capabilities that are still rarely exploited in today's mobile applications. Desktops, on the other hand, remain the dominant working environment, but with significantly different interaction means; additional hand-tracking devices that capture the user's gestures provide further input possibilities, yet these too often go unused. In this paper, we investigate a concept for orientation-based, touch-less interaction. Depending on the type of device, a traditional desktop or a mobile device, we use an interaction metaphor called "Waggle", which utilizes tilting and turning of either the user's hand or the mobile device itself as an additional input channel. Based on the results of two pilot studies, one for each environment, basic parameters for future design decisions are derived: on the one hand, the maximum angles for the basic rotation axes are evaluated; on the other hand, different discretizations of the tilt and turn angles are investigated. From the outcome of both studies, the optimal configuration for using the Waggle interaction metaphor in future applications on both mobile and desktop platforms is defined.
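To illustrate the kind of mapping the Waggle metaphor relies on, the following minimal Python sketch discretizes a continuous tilt angle into a fixed number of input steps. It is not taken from the paper: the maximum angle of 45 degrees and the five steps per direction are hypothetical placeholders standing in for the values the pilot studies are meant to determine, and `discretize_tilt` is an illustrative helper name.

```python
# Minimal sketch (not the authors' implementation): map a continuous tilt or
# turn angle onto a small set of discrete input steps, as the "Waggle"
# metaphor does for hand or device rotation. Maximum angle and step count are
# hypothetical placeholders for the values derived in the pilot studies.

def discretize_tilt(angle_deg: float, max_angle_deg: float = 45.0, steps: int = 5) -> int:
    """Map a tilt angle in [-max_angle_deg, +max_angle_deg] to a discrete step
    in {-steps, ..., 0, ..., +steps}; angles beyond the maximum are clamped."""
    clamped = max(-max_angle_deg, min(max_angle_deg, angle_deg))
    return round(clamped / max_angle_deg * steps)


if __name__ == "__main__":
    # Example readings, e.g. from a device gyroscope or a hand-tracking sensor.
    for angle in (-60.0, -20.0, 0.0, 12.5, 45.0):
        print(f"tilt {angle:6.1f} deg -> step {discretize_tilt(angle):+d}")
```

In an actual application, the angle would come from the mobile device's built-in gyroscope or from a hand-tracking device, and the resulting step would drive a selection or navigation action.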
