Industry 4.0 marks a shift toward fully automated digital production, in which intelligent systems manage processes in real time and interact continuously with their environment. Central to this evolution is robotics, which enhances productivity and precision in manufacturing. A key aspect of this advanced production model is human-robot interaction, where operators and robots work together on complex tasks, making safe collaboration between humans and robots a primary objective. This paper proposes a method for human gesture recognition based on multi-sensor data fusion: by combining data from multiple sensors, we obtain a more complete and robust representation of gestures. Our approach employs an algorithm that classifies human movements in real time from visual data. The process consists of four steps: data preprocessing, feature extraction, data fusion, and gesture classification. By applying machine learning and deep learning techniques for feature extraction and analysis, we aim to achieve high gesture-recognition accuracy.
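The four-step pipeline described above (preprocessing, feature extraction, fusion, classification) can be sketched in simplified form. The code below is an illustrative assumption, not the paper's actual method: it smooths each sensor stream, extracts basic statistical features, fuses them by concatenation (feature-level fusion), and classifies with a nearest-centroid rule in place of the deep models the paper envisions. All names (`make_window`, sensor shapes, gesture labels) are hypothetical.

```python
import numpy as np

def preprocess(window, k=3):
    # Step 1: moving-average smoothing along the time axis to suppress sensor noise.
    kernel = np.ones(k) / k
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, window)

def extract_features(window):
    # Step 2: simple per-channel statistics stand in for learned features.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fuse(feature_vectors):
    # Step 3: feature-level fusion by concatenating per-sensor feature vectors.
    return np.concatenate(feature_vectors)

class NearestCentroidClassifier:
    # Step 4: a minimal classifier; the paper targets ML/DL models instead.
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[d.argmin(axis=1)]

# Hypothetical synthetic data: a 2-channel camera stream and a 1-channel depth stream.
rng = np.random.default_rng(0)

def make_window(kind):
    t = np.linspace(0.0, 1.0, 30)
    cam = np.stack([t if kind == "wave" else np.zeros_like(t)] * 2, axis=1)
    depth = (t**2 if kind == "wave" else np.ones_like(t))[:, None]
    cam += 0.01 * rng.standard_normal(cam.shape)
    depth += 0.01 * rng.standard_normal(depth.shape)
    return cam, depth

def pipeline_features(cam, depth):
    # Run each sensor through steps 1-2, then fuse (step 3).
    return fuse([extract_features(preprocess(cam)), extract_features(preprocess(depth))])

X = np.stack([pipeline_features(*make_window(k)) for k in ["wave", "idle"] * 10])
y = np.array(["wave", "idle"] * 10)
clf = NearestCentroidClassifier().fit(X, y)
```

In practice, the statistical features would be replaced by a learned representation (e.g. a CNN over image frames), and fusion could happen at the data, feature, or decision level; concatenation is the simplest feature-level choice.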