Abstract—The ongoing coronavirus pandemic has highlighted the importance of hand hygiene in daily life, with governments and health authorities around the world promoting good hand hygiene practices. More than 1 million cases of hospital-acquired infections are reported in Europe annually. Hand hygiene compliance may reduce the risk of cross-transmission, thereby reducing the number of infections as well as healthcare expenditures. In this paper, WHO hand hygiene gestures were recorded and analyzed using an aluminum frame constructed at the laboratory sink. The gestures of thirty participants were recorded after a training session demonstrating the hand hygiene gestures. The video recordings were converted into image files and organized into six hand hygiene classes. The ResNet-50 architecture was selected for multi-class classification of the hand hygiene stages. The model was trained on the first set of classes (Fingers Interlaced, P2PFingers Interlaced, and Rotational Rub) for 25 epochs, achieving an accuracy of ~44% with a validation loss above 1.5. The second set of classes (Rub hands palm to palm, Fingers Interlocked, Thumb Rub) was trained for 50 epochs, achieving an accuracy of ~72% with a validation loss below 0.8. This work presents a preliminary analysis of a robust hand hygiene dataset with transfer learning, with the future aim of deploying a real-time hand hygiene prediction system for healthcare workers.
Index Terms—Hand hygiene, hand washing, computer vision, deep learning, transfer learning, image recognition.
Rashmi Bakshi is with the School of Electrical and Electronic Engineering, Technological University, City Campus, Ireland (e-mail: email@example.com).
Cite: Rashmi Bakshi, "WHO-Hand Hygiene Gesture Classification System," International Journal of Machine Learning and Computing, vol. 12, no. 6, pp. 312-317, 2022. Copyright © 2022 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.