Software and Hardware Complex Development to Monitor Production Activities Based on the YOLOv8 Neural Network

Authors

  • A. M. Presnetsov, Kalashnikov Izhevsk State Technical University
  • A. P. Tyurin, Kalashnikov Izhevsk State Technical University

DOI:

https://doi.org/10.22213/2410-9304-2023-2-140-151

Keywords:

image markup, YOLO neural network, machine vision, ESP8266 microcontroller, internet of things, monitoring of production activities

Abstract

The article discusses the development of a software and hardware complex for monitoring employee efficiency by means of ESP8266 microcontrollers and video cameras coupled with a YOLOv8 neural network, meeting the requirements of a "smart shop". The purpose of this study is to justify and evaluate the working concept of the created software and hardware complex in the form of a working prototype. Examples of existing monitoring systems implemented within the Internet of Things paradigm are given. However, the systems found in international practice focus on processing images of moving personnel only and do not analyze the technical parameters of the equipment on which these personnel work. This study attempts to overcome this limitation. A schematic diagram of the system, consisting of four key modules, has been developed: a data collection module for machine tools, an employee identification module, a primary data processing module, and a web server with a software application for managing the entire system in real time. The MQTT (Message Queuing Telemetry Transport) protocol is used for communication between the devices and the server to ensure reliable data transmission under channel bandwidth limitations. For the first time within this class of systems, a combined analysis is proposed of data from the machine tools (monitoring of the electric motor load) and of the movement of workers within the production room. Evaluation tests on the monitored data demonstrated the operability of the entire complex: objects were automatically detected in the camera image stream and located within the coordinate system of the production room.
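To illustrate the kind of pipeline the abstract describes, the sketch below shows one way the employee identification and primary data processing modules could be combined: a YOLOv8 model (from the ultralytics repository cited in the references) detects people in the camera stream and publishes their positions to the server over MQTT. This is a minimal sketch only; the broker address, topic names, and the pixel-to-floor mapping are assumptions for illustration, not details taken from the article.

```python
# Minimal sketch: detect people with YOLOv8 and publish their positions over MQTT.
# Broker address, topic layout and coordinate mapping are illustrative assumptions.
import json
import cv2
import paho.mqtt.client as mqtt
from ultralytics import YOLO

BROKER = "192.168.0.10"          # assumed address of the shop-floor MQTT broker
TOPIC = "shop/camera1/people"    # assumed topic layout

model = YOLO("yolov8n.pt")       # pretrained nano model from the ultralytics repository
# paho-mqtt 1.x style constructor; with paho-mqtt >= 2.0 pass mqtt.CallbackAPIVersion.VERSION2
client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()              # background network loop so publish() is flushed

cap = cv2.VideoCapture(0)        # production-room camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # class 0 in the COCO label set is "person"
    results = model(frame, classes=[0], verbose=False)[0]
    detections = []
    for box in results.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        detections.append({
            "conf": float(box.conf[0]),
            # foot point of the bounding box; a homography would map it to room coordinates
            "pixel": [(x1 + x2) / 2.0, y2],
        })
    client.publish(TOPIC, json.dumps(detections))
```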

Author Biographies

A. M. Presnetsov, Kalashnikov Izhevsk State Technical University

Master’s Degree student

A. P. Tyurin, Kalashnikov Izhevsk State Technical University

DSc in Engineering, Associate Professor

References

Leira F. S., Helgesen H. H., Johansen T. A. & Fossen T. I. (2021). Object detection, recognition, and tracking from UAVs using a thermal camera. Journal of Field Robotics, vol. 38, pp. 242-267. DOI: 10.1002/rob.21985.

Stoica G. V., Dogaru R. & Stoica E. C. (2014). Speeding-up image processing in reaction-diffusion cellular neural networks using CUDA-enabled GPU platforms. In "Proceedings of the 2014 6th International Conference on Electronics, Computers and Artificial Intelligence (ECAI)", Bucharest, Romania, pp. 39-42. DOI: 10.1109/ECAI.2014.7090162.

Challapalli S. S. N., Kaushik P., Suman S., Shivahare B. D., Bibhu V. & Gupta A. D. (2021). Web Development and performance comparison of Web Development Technologies in Node.js and Python. In "2021 International Conference on Technological Advancements and Innovations (ICTAI)", Tashkent, Uzbekistan, pp. 303-307, DOI: 10.1109/ICTAI53825.2021.9673464.

Ferdous M. & Ahsan S. M. M. (2022). PPE detector: a YOLO-based architecture to detect personal protective equipment (PPE) for construction sites. PeerJ Computer Science, 8:e999. DOI: 10.7717/peerj-cs.999.

Filichkin S. A. & Vologdin S. V. (2022). Application of the YOLOv5 neural network for recognizing the presence of personal protective equipment. Intellektual'nye sistemy v proizvodstve [Intelligent Systems in Manufacturing], vol. 20, no. 2, pp. 61-67 (in Russian). DOI: 10.22213/2410-9304-2022-2-61-67.

Redmon J., Divvala S., Girshick R. & Farhadi A. (2016). You Only Look Once: Unified, Real-Time Object Detection. In "2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)", pp. 779-788. DOI: 10.1109/CVPR.2016.91.

Li G., Song Z. & Fu Q. (2018). A New Method of Image Detection for Small Datasets under the Framework of YOLO Network. In "2018 IEEE 3rd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)", Chongqing, China, pp. 1031-1035, DOI: 10.1109/IAEAC.2018.8577214.

Balakrishnan B., Chelliah R., Venkatesan M. & Sah C. (2022). Comparative Study On Various Architectures Of Yolo Models Used In Object Recognition. In "2022 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS)", Greater Noida, India, pp. 685-690. DOI: 10.1109/ICCCIS56430.2022.10037635.

Official YOLOv8 repository. URL: https://github.com/ultralytics/ultralytics (accessed: 13.01.2023).

Timoshkin M. S., Mironov A. N. & Leontiev A. S. (2022). Comparison of YOLOv5 and Faster R-CNN for detecting people in images in streaming mode. Mezhdunarodnyi nauchno-issledovatel'skii zhurnal [International Research Journal], no. 6 (120) (in Russian). DOI: 10.23670/IRJ.2022.120.6.020.

Hmissi F. & Ouni S. (2022). TD-MQTT: Transparent Distributed MQTT Brokers for Horizontal IoT Applications. In "2022 IEEE 9th International Conference on Sciences of Electronics, Technologies of Information and Telecommunications (SETIT)", Hammamet, Tunisia, pp. 479-486. DOI: 10.1109/SETIT54465.2022.9875881.

Filipe L., Peres R. S. & Tavares R. M. (2021). Voice-Activated Smart Home Controller Using Machine Learning. IEEE Access, vol. 9, pp. 66852-66863. DOI: 10.1109/ACCESS.2021.3076750.

Macheso P., Manda T. D., Chisale S., Dzupire N., Mlatho J. & Mukanyiligira D. (2021). Design of ESP8266 Smart Home Using MQTT and Node-RED. In "2021 International Conference on Artificial Intelligence and Smart Systems (ICAIS)", Coimbatore, India, pp. 502-505. DOI: 10.1109/ICAIS50930.2021.9396027.

Ilin I., Shirokova S. & Lepekhin A. (2018). IT Solution concept development for tracking and analyzing the labor effectiveness of employees. E3S Web of Conferences, vol. 33, article number 03007. DOI: 10.1051/e3sconf/20183303007.

Al Jassmi H., Al Ahmad M. & Ahmed S. (2021). Automatic recognition of labor activity: a machine learning approach to capture activity physiological patterns using wearable sensors. Construction Innovation, vol. 21, no. 4, pp. 555-575. DOI: 10.1108/CI-02-2020-0018.

Yan J. & Wang Z. (2022). YOLO V3 + VGG16-based automatic operations monitoring and analysis in a manufacturing workshop under Industry 4.0. Journal of Manufacturing Systems, vol. 63, pp. 134-142. DOI: 10.1016/j.jmsy.2022.02.009.

Cheng M.-Y., Khitam A. F. K. & Tanto H. H. (2023). Construction worker productivity evaluation using action recognition for foreign labor training and education: A case study of Taiwan. Automation in Construction, vol. 150, article number 104809. DOI: 10.1016/j.autcon.2023.104809.

Konstantinou E. & Brilakis I. (2019). Monitoring construction labour productivity by way of a smart technology approach. In "Proceedings of the Institution of Civil Engineers - Smart Infrastructure and Construction", 172, Article 2. DOI: 10.1680/jsmic.20.00014.

Emelyanovich A. A., Koval S. V. & Galimova A. N. (2021). Working time management as a way to increase labor productivity. Vestnik Kemerovskogo gosudarstvennogo universiteta. Seriya: Politicheskie, sotsiologicheskie i ekonomicheskie nauki [Bulletin of Kemerovo State University. Series: Political, Sociological and Economic Sciences], vol. 6, no. 2, pp. 208-218 (in Russian). DOI: 10.21603/2500-3372-2021-6-2-208-218.

Skryhun N. & Nyzhnyk S. (2020). Time management as an important component of successful business activities. Middle European Scientific Bulletin, 2, 13-15. DOI: 10.47494/mesb.2020.2.13.

Published

30.06.2023

How to Cite

Presnetsov A. M., & Tyurin A. P. (2023). Software and Hardware Complex Development to Monitor Production Activities Based on the YOLOv8 Neural Network. Intellekt. Sist. Proizv., 21(2), 140–151. https://doi.org/10.22213/2410-9304-2023-2-140-151

Issue

Section

Articles