Portable Biomechanics Laboratory: Clinically Accessible Movement Analysis from a Handheld Smartphone

J.D. Peiffer1,2, Kunal Shah1, Irina Djuraskovic1,3, Shawana Anarwala1, Kayan Abdou1, Rujvee Patel4, Prakash Jayabalan1,5, Brenton Pennicooke4, R. James Cotton1,5

1Shirley Ryan AbilityLab
2Biomedical Engineering, Northwestern University 3Interdepartmental Neuroscience, Northwestern University 4Neurological Surgery, Washington University School of Medicine 5Physical Medicine and Rehabilitation, Northwestern University Feinberg School of Medicine

Portable Biomechanics Laboratory enables 3D biomechanical analysis from a moving camera. Perfect for in-clinic evaluation of movement!

Abstract

We present the first clinically validated method for biomechanical analysis of human movement from a handheld smartphone.

The way a person moves is a direct reflection of their neurological and musculoskeletal health, yet it remains one of the most underutilized vital signs in clinical practice. Although clinicians visually observe movement impairments, they lack accessible and validated methods to objectively measure movement in routine care. This gap prevents wider use of biomechanical measurements in practice, which could enable more sensitive outcome measures or earlier identification of impairment.

In this work, we present our Portable Biomechanics Laboratory (PBL), which includes a secure, cloud-enabled smartphone app for data collection and a novel algorithm for fitting biomechanical models to these data. We extensively validated PBL’s biomechanical measures using a large, clinically representative, and heterogeneous dataset with synchronous ground truth. Next, we tested the usability and utility of our system in both a neurosurgery clinic and a sports medicine clinic.

We found joint angle errors within 3 degrees and pelvis translation errors within several centimeters across participants with neurological injury, lower-limb prosthesis users, pediatric inpatients, and controls. The system was quick and easy to use, and gait metrics computed from the PBL showed high reliability (ICCs > 0.9) and sensitivity to clinical differences. For example, in individuals undergoing decompression surgery for cervical myelopathy, the modified Japanese Orthopedic Association (mJOA) score is a common patient-reported outcome measure; we found that PBL gait metrics not only correlated with mJOA scores but also demonstrated greater responsiveness to surgical intervention than the patient-reported outcomes.

These findings support the use of handheld smartphone video as a scalable, low-burden tool for capturing clinically meaningful biomechanical data, offering a promising path toward remote, accessible monitoring of mobility impairments in clinical populations. To promote further research and clinical translation, we open-source the first method for measuring whole-body kinematics from handheld smartphone video validated in clinical populations.

Workflow

A) Researchers held a smartphone (optionally with a gimbal) while following a participant walking. Our system imposes no specific requirements on viewing angle, distance to the subject, or therapist assistance.

B) Recorded smartphone video and optional wearable sensor data are stored in the cloud and processed using PosePipe, an open-source package implementing computer vision models for person tracking and keypoint detection.

C) To reconstruct movement, we represent it as a function that outputs joint angles, which, combined with body scaling parameters and evaluated through forward kinematics, generate a posed biomechanical model in 3D space. This posed model is compared to video-extracted joint locations, and optionally to smartphone sensor data, to compute a loss that guides backpropagation, iteratively refining both the kinematic trajectory and the body scale.

D) Initially, the representation knows nothing about the person’s movements or scale (e.g., height, limb proportions); after optimization, it typically tracks joint locations to within 15 mm in 3D and 5 pixels in 2D.

Workflow diagram
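To make the optimization in panel C concrete, here is a deliberately simplified sketch: a toy two-link planar chain stands in for the full biomechanical model, and gradient descent jointly refines a joint-angle trajectory and the shared segment lengths to match synthetic "detected" keypoints. Everything here (the kinematic chain, loss, and learning rates) is illustrative and hypothetical, not the PBL implementation.

```python
import jax
import jax.numpy as jnp

def forward_kinematics(angles, lengths):
    # Toy 2-link planar chain: (T, 2) hip/knee angles -> (T, 3, 2) joint positions.
    hip = jnp.zeros((angles.shape[0], 2))
    knee = hip + lengths[0] * jnp.stack(
        [jnp.sin(angles[:, 0]), -jnp.cos(angles[:, 0])], axis=-1)
    ankle = knee + lengths[1] * jnp.stack(
        [jnp.sin(angles[:, 0] + angles[:, 1]),
         -jnp.cos(angles[:, 0] + angles[:, 1])], axis=-1)
    return jnp.stack([hip, knee, ankle], axis=1)

def loss(params, observed):
    pred = forward_kinematics(params["angles"], params["lengths"])
    return jnp.sum((pred - observed) ** 2)  # keypoint-matching loss

# Synthetic "detected keypoints" standing in for video-extracted joint locations.
T = 50
true_angles = jnp.stack([jnp.linspace(0.2, 0.6, T),
                         jnp.linspace(0.1, 0.4, T)], axis=-1)
true_lengths = jnp.array([0.45, 0.43])
observed = forward_kinematics(true_angles, true_lengths)

# Start knowing neither the trajectory nor the segment lengths (cf. panel D).
params = {"angles": jnp.zeros((T, 2)), "lengths": jnp.array([0.5, 0.5])}
grad_fn = jax.jit(jax.grad(loss))
for _ in range(5000):
    g = grad_fn(params, observed)
    params = {
        # Per-frame angles take a full gradient step; the shared scale
        # parameters see gradients summed over all T frames, so shrink their step.
        "angles": params["angles"] - 0.1 * g["angles"],
        "lengths": params["lengths"] - (0.1 / T) * g["lengths"],
    }
```

After optimization, both the recovered trajectory and the body-scale parameters should closely match the values that generated the observations, mirroring how the full system refines kinematics and scale together.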

Clinical Analysis

A) Hip and knee flexion angles of the clinical and control groups.

B) The Gait Deviation Index (GDI) separates groups at risk of falls, as determined by the Berg Balance Scale.

C) GDI correlates with 10 Meter Walk Test performance (r = 0.82).

D) GDI of lower-limb prosthesis users (LLPU) and knee osteoarthritis (KOA) participants is significantly lower than that of controls. Further, the GDI of transfemoral amputees is significantly lower than that of transtibial amputees.

E) GDI collected in clinical settings correlates (r = 0.47) with the mJOA, a clinically used ordinal questionnaire.

Clinical Analysis
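For readers unfamiliar with the GDI, its conventional construction (Schwartz & Rozumalski, 2008) projects each subject's concatenated joint-angle curves onto principal components learned from a control group, then rescales the log-distance from the control mean so that controls score roughly mean 100 with SD 10. The sketch below applies that construction to synthetic curves; it is an illustration of the standard metric, not the PBL codebase.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_control_basis(control_curves, n_components=15):
    # Learn the control-group mean and a PCA basis of gait features.
    ctrl_mean = control_curves.mean(axis=0)
    _, _, vt = np.linalg.svd(control_curves - ctrl_mean, full_matrices=False)
    return ctrl_mean, vt[:n_components]

def deviation_index(curves, ctrl_mean, basis, log_mu, log_sd):
    # Distance from the control mean in feature space, rescaled so that
    # controls score ~mean 100 with SD 10 (higher = more typical gait).
    feats = (curves - ctrl_mean) @ basis.T
    z = (np.log(np.linalg.norm(feats, axis=1)) - log_mu) / log_sd
    return 100.0 - 10.0 * z

# Synthetic stand-ins for concatenated joint-angle curves (subjects x samples).
controls = rng.normal(size=(30, 200))
impaired = controls[:10] * 3.0   # amplified deviations from typical gait

ctrl_mean, basis = fit_control_basis(controls)
ctrl_dist = np.linalg.norm((controls - ctrl_mean) @ basis.T, axis=1)
log_mu, log_sd = np.log(ctrl_dist).mean(), np.log(ctrl_dist).std()

gdi_controls = deviation_index(controls, ctrl_mean, basis, log_mu, log_sd)
gdi_impaired = deviation_index(impaired, ctrl_mean, basis, log_mu, log_sd)
```

By construction the control group centers at 100, and subjects whose curves deviate more from the control distribution score lower, which is why lower GDI tracks greater impairment in the panels above.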

Related Links

This builds on a large body of work our lab has been doing in multiview markerless motion capture.

PosePipeline is an open-source package implementing state-of-the-art computer vision models in a nicely organized database. You can find it on GitHub here.

Markerless Motion Capture and Biomechanical Analysis Pipeline describes our multi-camera data collection system and some early biomechanical analysis results.

Differentiable Biomechanics Unlocks Opportunities for Markerless Motion Capture uses the same end-to-end optimization strategy for multi-view RGB video.

Care about validation with optical motion capture? In this work we perform a head-to-head comparison between optical motion capture and our markerless approaches (in clinical populations).

Want wearables? This work describes an approach to fuse PBL's video-based biomechanics with wearable sensors. We find that video does well except during periods of occlusion.
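The occlusion-robust fusion idea can be illustrated with a toy one-dimensional Kalman filter: an IMU-style acceleration input drives the prediction step at every frame, while video-derived position measurements update the state only when the subject is visible. This is a hypothetical sketch on synthetic data, not the method from that paper.

```python
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
B = np.array([[0.5 * dt**2], [dt]])     # how acceleration enters the state
H = np.array([[1.0, 0.0]])              # video observes position only
Q = 1e-4 * np.eye(2)                    # process noise covariance
meas_std = 0.05                         # assumed video keypoint noise
R = np.array([[meas_std**2]])           # video measurement noise covariance

rng = np.random.default_rng(1)
steps = 500
t_axis = np.arange(steps) * dt
true_pos = np.sin(2 * np.pi * t_axis / 5.0)      # synthetic "joint position"
true_acc = -((2 * np.pi / 5.0) ** 2) * true_pos  # exact second derivative

x = np.zeros((2, 1))   # state: [position, velocity]
P = np.eye(2)
est = []
for t in range(steps):
    # Predict with the IMU-style acceleration input (always available).
    x = F @ x + B * true_acc[t]
    P = F @ P @ F.T + Q
    if not (200 <= t < 300):                       # video occluded on [200, 300)
        z = true_pos[t] + meas_std * rng.normal()  # noisy video position
        y = z - H @ x                              # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    est.append(float(x[0, 0]))

err = np.abs(np.array(est) - true_pos)
```

During the simulated occlusion the filter coasts on the inertial input rather than losing the track, which is the qualitative behavior the fusion work targets.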