Hand-Centric Computer Vision for Assisted Small Parts Technical Assembly

Description

We rely on our hands in nearly all daily activities. This thesis focuses on activity tracking in small parts industrial assembly from a hand-centric viewpoint, enabling more dynamic supervision in environments where workers frequently move within the assembly area. The model is also expected to anticipate future actions and provide feedback for assistive assembly (Ragusa et al., 2023). Traditional action recognition models rely on an egocentric viewpoint, such as models trained on the EgoHands dataset (Bambach et al., 2015) or the Assembly101 model for small-cart assembly activity recognition (Sener et al., 2022). These existing models, built for first-person footage, struggle to precisely track and recognize hand interactions within the workspace from a hand-centric viewpoint in industrial assembly settings, and they often fail to capture the dynamics between hands and objects, which is required for accurate small parts assembly tracking (Pei et al., 2025). This thesis proposes a computer vision-based system that detects and recognizes hands and assembly parts in the working area, using YOLO (You Only Look Once) for object detection and tracking (Hashimoto et al., 2019), with the aim of identifying working hands and small assembly parts within the workspace.
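
As a rough illustration of the proposed detection step, the sketch below uses the Ultralytics YOLO API to detect and track objects across frames of a workspace recording. The weights file (assembly_hands.pt), the video path (workbench.mp4), and the class labels are hypothetical placeholders, not artifacts from the thesis; a real system would be fine-tuned on annotated footage of the assembly area.

# Minimal sketch of YOLO-based hand/part detection and tracking, assuming
# the Ultralytics YOLO package. The weights file, video path, and class
# names are hypothetical placeholders, not from the thesis itself.
from ultralytics import YOLO

model = YOLO("assembly_hands.pt")  # hypothetical fine-tuned weights

# Run the built-in tracker over a workspace recording, frame by frame.
for result in model.track(source="workbench.mp4", persist=True, stream=True):
    for box in result.boxes:
        label = result.names[int(box.cls)]        # e.g. "hand", "screw"
        track_id = int(box.id) if box.id is not None else -1
        x1, y1, x2, y2 = box.xyxy[0].tolist()     # bounding box corners
        print(f"{label} (track {track_id}): ({x1:.0f}, {y1:.0f}) to ({x2:.0f}, {y2:.0f})")

Pointing the same call at a live camera (e.g., source=0) rather than a file would support the kind of online supervision described above.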

Details

Date Created
  • 2025
Language
  • English
Note
  • Partial requirement for: M.S., Arizona State University, 2025
  • Field of study: Engineering
Extent
  • 43 pages
Open Access
Peer-reviewed