Congratulations to Vanessa Wirth, this year's Digital Rheumatology Research Award winner.

Vanessa Wirth

M.Sc. Computer Science

FAU Erlangen-Nürnberg, Germany


Markerless hand tracking technologies are not yet applied in clinical practice, despite their potential to support the diagnosis and monitoring of disease activity in inflammatory musculoskeletal diseases. Most related methods focus on reconstructing coarse, plausible poses, as they are heavily tailored to entertainment use cases such as Virtual and Augmented Reality or Human-Computer Interaction. Thus, they lack the accuracy and reliability required for clinical applications. To bridge the gap between coarse, interactive hand simulation and accurate but time-intensive marker-based motion capture, we propose ShaRPy, the first RGB-D Shape Reconstruction and hand Pose tracking system that provides uncertainty estimates of the computed pose, e.g., when a finger is occluded or its estimate is inconsistent with the observations in the input, to guide clinical decision-making.
Besides pose, ShaRPy approximates a personalized hand shape, promoting a more realistic and intuitive understanding of its digital twin. Our method requires only a lightweight setup with a single consumer-level RGB-D camera, yet it is able to distinguish similar poses with only minor joint angle deviations in a metrically accurate space. This is achieved by combining a data-driven dense correspondence predictor with traditional energy minimization. We leverage a parametric hand model in which we incorporate biomechanical constraints and optimize for both its pose and hand shape parameters. To demonstrate its applicability, we evaluate the accuracy of ShaRPy on a state-of-the-art keypoint detection benchmark and show qualitative results of hand function assessments for activity monitoring of musculoskeletal diseases. We conclude that ShaRPy is a helpful tool to study and monitor disease activity. To foster applications in the health domain, we will release ShaRPy as open-source in the near future.
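To give a flavor of the model-fitting idea described above (fitting pose and shape parameters of a parametric model to observed correspondences via energy minimization, with biomechanical joint limits), here is a deliberately minimal, hypothetical sketch. It uses a toy planar finger chain and SciPy's bounded least squares, not ShaRPy's actual hand model or solver; all names and parameter values below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy "parametric hand": three joint angles of a planar finger chain plus a
# uniform bone-length scale. Purely illustrative; NOT the model used by ShaRPy.
BONE = np.array([0.04, 0.025, 0.02])  # hypothetical segment lengths in metres

def forward(params):
    """Forward kinematics: 2D positions of the chain's joints, flattened."""
    scale, *angles = params
    pts, pos, theta = [], np.zeros(2), 0.0
    for length, ang in zip(BONE * scale, angles):
        theta += ang
        pos = pos + length * np.array([np.cos(theta), np.sin(theta)])
        pts.append(pos)
    return np.concatenate(pts)

def fit(observed, x0):
    """Energy minimization: least-squares fit of shape (scale) and pose
    (angles) to observed correspondences. The box constraints stand in for
    biomechanical joint limits (here: flexion only, up to 90 degrees)."""
    return least_squares(
        lambda p: forward(p) - observed,
        x0,
        bounds=([0.8, 0.0, 0.0, 0.0], [1.2, np.pi / 2, np.pi / 2, np.pi / 2]),
    )

# Synthesize observations from known parameters, then recover them.
gt = np.array([1.05, 0.3, 0.5, 0.4])      # ground-truth scale and angles
obs = forward(gt)                          # stand-in for predicted correspondences
sol = fit(obs, x0=np.array([1.0, 0.1, 0.1, 0.1]))
print(np.round(sol.x, 4))
```

In a real system the residuals would compare a full 3D hand mesh against dense correspondences from the RGB-D input, and quantities derived from the fit (e.g., the residual magnitude per finger) could inform the kind of per-joint uncertainty estimate the abstract describes.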