Gesture-Based Control of an OMRON Viper 650 Robot

Abstract:

This project develops and implements a robotic control system based on detecting signs and gestures with computer vision. The main goal was to create an intuitive and efficient interface for interacting with an OMRON Viper 650 industrial robot. To achieve this, the computer vision libraries MediaPipe and OpenCV were used to detect and recognize the user's hands and fingers in real time. The detected gesture data were processed by a Python script and stored in a text file. A companion program, developed in C# against OMRON's ACE programming interface, reads the data from that text file and sends the corresponding commands to the Viper 650 robot, enabling it to interpret the user's gestures and act accordingly. The result is an innovative solution that combines computer vision, programming, and industrial robotics into an intuitive and efficient control experience, opening up new possibilities in industrial and human-robot interaction applications.
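The Python detection side of the pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' code: the finger-counting heuristic, the function names, and the file name `gesture.txt` are assumptions made here for clarity. In the full system, the 21 hand landmarks would come from MediaPipe Hands on webcam frames captured with OpenCV.

```python
# Minimal sketch of the Python detection side (illustrative only).
# Assumptions not taken from the paper: the finger-counting heuristic,
# these function names, and the file name "gesture.txt".
# In the full pipeline, the 21 hand landmarks would be produced by
# MediaPipe Hands (mediapipe) on frames captured with OpenCV (cv2).

FINGER_TIPS = [8, 12, 16, 20]   # MediaPipe indices: index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # the corresponding PIP joints

def count_extended_fingers(landmarks):
    """Count non-thumb fingers whose tip lies above its PIP joint.

    `landmarks` is a list of 21 (x, y) tuples in MediaPipe's normalized
    image coordinates, where y grows downward; for an upright hand an
    extended finger therefore has tip_y < pip_y.
    """
    return sum(
        1
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if landmarks[tip][1] < landmarks[pip][1]
    )

def write_gesture(finger_count, path="gesture.txt"):
    """Persist the latest gesture code to a text file.

    The C#/ACE program on the robot side can then poll this file and
    map each gesture code to a Viper 650 motion command.
    """
    with open(path, "w") as f:
        f.write(str(finger_count))
```

Writing the gesture code to a plain text file, as the abstract describes, decouples the vision process from the robot controller: the Python script and the C# ACE program need only agree on the file's location and format, not on any shared runtime.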

Info:

Periodical:

Engineering Headway (Volume 12)

Pages:

49-59

Online since:

September 2024

Copyright:

© 2024 Trans Tech Publications Ltd. All Rights Reserved
