Digitisation of Human Body Motion for Garment Production

Principal Investigator: Professor Norman C. TIEN (Department of Electrical and Electronic Engineering)

This project is showcased in the inaugural exhibition – Engineering for Better Living in Innovation Wing Two

Project information

Introduction

Garment production is a laborious process that relies primarily on manual operations. Smart robots are set to play a vital role in future automation, assisting human workers with repetitive and/or high-risk tasks. To achieve interactive human-robot collaboration, robots need to learn and understand how humans work, so a cost-effective means of digitising manual operations is essential. In this project, we aim to develop an innovative approach to high-fidelity, real-time full-body motion capture for garment workers without using specialty cameras.

Novelty of the Project

Existing methods for 3D full-body motion capture use either multiple high-end industrial cameras or RGB cameras equipped with depth sensors; both are costly and usually require sophisticated setups. As for human pose estimation, most existing works estimate only the 3D coordinates of each joint of the human body, neglecting the fact that it is the rotation of the skeletal bones that produces different human poses. Our approach treats the human body as a kinematic tree and learns to infer the rotation of each skeletal bone. Using monocular videos as input, our algorithm incorporates knowledge learned from a large-scale dataset and leverages a fusion strategy to integrate both body and hand poses, so that pose estimates correspond more closely to the actual human body.
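To illustrate what "treating the human body as a kinematic tree" means, the sketch below shows a minimal forward-kinematics routine: given a local rotation for each bone, it recovers 3D joint positions by composing rotations from the root down the tree. This is an illustrative textbook computation in Python/NumPy, not the project's actual algorithm; the joint names, offsets, and `rotation_z` helper are hypothetical.

```python
import numpy as np

def rotation_z(theta):
    """3x3 rotation matrix about the z-axis (one simple example rotation)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def forward_kinematics(parents, offsets, rotations):
    """
    Compute global 3D joint positions from per-bone local rotations.

    parents[i]   : index of joint i's parent in the kinematic tree (-1 for root)
    offsets[i]   : rest-pose offset of joint i from its parent
    rotations[i] : local 3x3 rotation applied at joint i
    """
    n = len(parents)
    positions = np.zeros((n, 3))
    global_rots = [None] * n
    for i in range(n):                  # parents must precede children
        if parents[i] < 0:              # root joint
            global_rots[i] = rotations[i]
            positions[i] = offsets[i]
        else:
            p = parents[i]
            # Compose the parent's global rotation with this joint's local one,
            # and place the joint by rotating its rest offset into world frame.
            global_rots[i] = global_rots[p] @ rotations[i]
            positions[i] = positions[p] + global_rots[p] @ offsets[i]
    return positions

# Toy 3-joint chain (shoulder -> elbow -> wrist), each bone 1 unit long.
# Bending the elbow by 90 degrees moves the wrist, showing that poses
# are determined by bone rotations rather than by joint coordinates alone.
parents = [-1, 0, 1]
offsets = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0]])
rotations = [rotation_z(0.0), rotation_z(np.pi / 2), rotation_z(0.0)]
pose = forward_kinematics(parents, offsets, rotations)
# wrist ends up at roughly (1, 1, 0) after the 90-degree elbow bend
```

A pose-estimation model built on this representation predicts the `rotations` list for every bone of a full-body skeleton; forward kinematics then yields joint positions that are guaranteed to respect bone lengths, which is one reason rotation-based estimation corresponds better to the actual human body.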

Benefit to the Community

Beyond virtual reality and 3D gaming, which may first come to mind, the developed technology will facilitate human-robot collaboration in the labour-intensive garment industry: the motions and activities of human workers are digitally captured by the system and analysed for motion intent, so that artificial intelligence can predict workers' subsequent movements. With this capability, robotic assistance can be implemented effectively, enhancing the productivity and safety of various production and handling tasks. Because it requires only a commodity digital camera, the technology may also find promising applications in monitoring the elderly to enable ageing-in-place and in evaluating postures and movements during rehabilitation.

About the scholar

Professor Tien is the Chair Professor of Microsystems Technology at HKU and Managing Director of the Centre for Transformative Garment Production. He is an expert in MEMS technology and robotics.

About the Centre

The project is part of the large-scale research programmes of the Centre for Transformative Garment Production, which was jointly established by HKU and Tohoku University, Japan, to tackle real-world technical problems facing the garment production industry. The Centre has been admitted to the AIR@InnoHK Research Cluster with funding support from the Innovation and Technology Commission of the HKSAR Government.

Enquiry / Feedback

Please feel free to send enquiries or feedback to the research team by filling in the form (https://forms.gle/JV59N47nTj19ndYz6). Thank you!