Big project

Mobile AR motion tracking for physical therapy

Semi-finalist at Penn Apps, Fall 2017



Within the world of physical therapy, and healthcare in general, Bomo seeks to solve three problems:


Software that accurately tracks joint flexion, walking patterns, and body movement is extremely costly and hard to set up in an everyday environment: e.g., a home, gym, or small doctor’s office.


To get accurate measurements, you likely have to go to a motion analysis lab, and you need access to one in the first place.


Physical therapists and doctors often don’t take consistent, accurate measurements and frequently “eyeball” their results.

Bomo is a mobile-first, AR tracking system for physical therapy that tries to solve these problems by providing an inexpensive, consistent, accurate, and convenient way to track joint health and communicate with healthcare providers.

Early work & inspiration

Before Bomo itself was created, its predecessor had its origins in the Emory MOTIONS lab as a flex-sensor experiment.

Our idea was to convert changes in flex sensor values into joint angles.
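That conversion can be done with a simple calibration: record the raw sensor value at a known straight position and at a known bent angle, then interpolate linearly. The sketch below is an illustration under that assumption (real flex sensors are only roughly linear, and the names and values here are hypothetical, not from the original project):

```python
def calibrate(raw_straight, raw_bent, angle_bent=90.0):
    """Return a function mapping raw flex-sensor readings to joint angles,
    assuming an approximately linear sensor response (a simplification)."""
    scale = angle_bent / (raw_bent - raw_straight)

    def to_angle(raw):
        # Offset by the straight-position baseline, then scale to degrees.
        return (raw - raw_straight) * scale

    return to_angle

# Hypothetical calibration: 200 at 0 degrees, 550 at 90 degrees.
to_angle = calibrate(raw_straight=200, raw_bent=550)
print(to_angle(375))  # midpoint reading → 45.0 degrees
```

A two-point calibration like this is the simplest workable approach; a lookup table or polynomial fit would handle the sensor's nonlinearity better.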

We attached our device to the fronts of our ankles like so:



And we ended up with results like these, giving us great insight into the feasibility of approximating the measurements of expensive 3D systems with simple sensors.

Left: Flex sensor. Right: 3D motion tracking.

We also created a game for our test subjects to play during sessions, which turned out to be a phenomenal way to make the data and research relatable and personal.



The main concept was that, over time, users would play the game and gradually increase the flexibility of their ankles by receiving realtime visual feedback.


We completed the first working version of Bomo in 30 hours at PennApps XVI, where we placed as Semi-Finalists, finishing in the top 30 of roughly 300 competing teams. We quickly iterated and built multiple custom integrations between Vuforia and Apple's native 3D rendering engine in order to create the first prototype.

We focused on two key features:

1. Allow caregivers to measure joint health dynamically in realtime, with minimal invasiveness.
2. Facilitate patient-doctor communication, and make at-home data collection easy.
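For the first feature, a joint angle can be estimated from the 3D positions of tracked markers: take three points (for example hip, knee, and ankle) and compute the angle at the middle vertex. This is a minimal sketch of that geometry, not Bomo's actual code, and the marker names are illustrative:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at vertex b, formed by 3D points a-b-c
    (e.g. hip-knee-ankle markers tracked in AR space)."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# A fully straight limb: collinear markers give a 180-degree angle.
print(joint_angle((0, 2, 0), (0, 1, 0), (0, 0, 0)))  # → 180.0
```

Computed per frame from the AR tracker's marker positions, this gives a dynamic, realtime angle measurement without any sensors strapped to the patient.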

The Hackathon Submission:


Current Medical Use:

I brought this to researchers at Georgia Tech, who happily adopted the software for TMJ research, using it to measure the intricate jaw movements of children and cross-compare the data with microphone data.