Bomo

Big project

Mobile AR motion tracking for physical therapy

Semi-finalist at Penn Apps, Fall 2017

Project brief

In September 2017, three friends and I went to Penn Apps, where we placed as semi-finalists with our app Bomo: an AR motion tracking app. We later used Bomo in several scientific experiments as a data collection tool for TMJ research.

Role
Lead Software Engineer
Timeline
September 2017 - December 2017 (3 months)

What's this?

Bomo is a mobile-first AR tracking system for physical therapy that provides an inexpensive, consistent, accurate, and convenient way to track joint health.

Our final submission video for Penn Apps 2017 can be seen below:

I was the lead engineer and directed the project vision. I worked with two other amazing engineers, my friends Jake Cronin and Antonio Chan, and the very talented designer Hiroo Aoyama.

Jake, Hiroo, Antonio, and me. After 36 hours of no sleep, I'm surprised we could stand up for this picture.

The hackathon

For a brief summary, our Devpost entry can be found here.

Problem identification

Within the world of physical therapy, and healthcare in general, there are three problems that Bomo seeks to solve:

COST:

Software to accurately track joint flexion, walking patterns, and body movement is extremely costly and hard to set up in a normal environment: e.g., a home, gym, or small doctor’s office.

CONVENIENCE:

In order to get accurate measurements, you likely have to travel to a motion analysis lab, and have access to one in the first place.

CONSISTENCY:

Physical therapists and doctors often don’t take enough care to record consistent, accurate measurements, and frequently “eyeball” their results.

The pitch

You’re playing an intense game of basketball. As you are about to shoot the tie-breaking shot, you tear your ACL. You go to the doctor sulking in defeat, and during weeks of physical therapy, your PT arbitrarily asks you to “try a little harder than last week” during each session. Then your PT hears about the app BOMO, which lets them objectively track your knee recovery with just a smartphone. With BOMO, you now have extremely detailed records of your rehabilitation and can see your progress in an immersive and personalized way, without expensive hardware or software.

BOMO can not only track joint flexion angles, which are useful for measuring flexibility progress, but also track joint movement in 3D, allowing users to analyze common dysfunctions such as lateral knee movement during walking and running, track overall joint stability, and measure muscular imbalances, all with just a smartphone.
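To illustrate the underlying math: once three joint positions are tracked in 3D, a flexion angle falls out of a couple of vector operations. The app itself was written in Swift, but here is a minimal Python sketch (the function name and coordinates are mine for illustration, not from the Bomo codebase):

```python
import numpy as np

def flexion_angle(hip, knee, ankle):
    """Knee flexion angle in degrees from three tracked 3D joint positions.

    0 degrees = fully extended leg; larger values = deeper flexion.
    """
    hip, knee, ankle = (np.asarray(p, dtype=float) for p in (hip, knee, ankle))
    thigh = hip - knee     # vector from knee toward hip
    shank = ankle - knee   # vector from knee toward ankle
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    # clip guards against floating-point values slightly outside [-1, 1]
    included = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return 180.0 - included  # flexion measured relative to a straight leg

# Straight leg (collinear joints) -> ~0 degrees of flexion
print(flexion_angle([0, 1, 0], [0, 0.5, 0], [0, 0, 0]))  # ~0.0
# Right-angle bend at the knee -> ~90 degrees of flexion
print(flexion_angle([0, 1, 0], [0, 0, 0], [1, 0, 0]))    # ~90.0
```

The same three-point computation works for any hinge-like joint; only which tracked landmarks you feed in changes.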

UX Design

All of this started with a night of sleepless sketching and brainstorming, as always.

After some coffee and minimal sleep, we consolidated our whiteboard mess into a few key UX design choices and personas. All of the design deliverables were created together with Hiroo Aoyama.

After persona development, Hiroo and I moved on to basic low-fidelity wireframing. The primary user flow let the user select the movement they wanted to train, then showed a pop-up screen clearly explaining how to set up the app before dropping them into the AR camera screen.

We then designed a wireframe for our stats page. We had three primary categories: past, today, and goal. During some brief user testing and interviewing, a student at the hackathon told Hiroo that there was "no point in showing all three under a single page," and that he should have seen only "today" as the default category. A main graph shows daily or weekly progress in range of motion, and a summary section complements the graph with further statistics. A final Tips section walks through the statistics and offers constructive feedback to help the user speed up their progress.

Prototyping and high fidelity design

We quickly iterated and built multiple custom integrations between Vuforia and Apple's native 3D rendering engine to create the first prototype. We designed alongside development, and the entire process was incredibly interdisciplinary. Designers coded and coders designed.

Our basic workflow was to create the overall UI in Sketch, then export icons and dimensions for us to implement in Swift. In the background, Jake and I were always working on the core AR rendering tech that we could integrate into basically any UX use case we were imagining.

After we finished the front page, Hiroo designed the flow of choosing the category and opening the AR. 

We wanted to really focus on improving movement, not body parts, so Hiroo made six popular fitness movement icons. For development purposes, we focused on the squat (because I love squatting and it's one of the most relevant demos for people!).

We were often faced with the "can we actually build this?" challenge. Hiroo designed the UI of our AR component, but because that depended on what we could realistically develop, he kept it simple and flexible for any changes.

Afterwards, Hiroo and I designed the stats page, but this time we added three additional categories for filtering the graph: Depth, Velocity, and Power. 

We did basic user interviews at the hackathon and saw that, from a sports science perspective, people were really interested in understanding more in-depth metrics about performance. Velocity can tell users to slow down or speed up during an exercise, while power can estimate how many calories they have burned or how much they should change their velocity. These metrics are incredibly important for training things like Olympic weightlifting movements or jumping ability.
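Once positions are being tracked, depth, velocity, and power fall out of basic kinematics. A minimal sketch of how such metrics might be derived from sampled vertical positions (Python for illustration; the function and sample values are hypothetical, and power here accounts only for work against gravity):

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def rep_metrics(heights_m, timestamps_s, load_kg):
    """Mean concentric velocity and power from sampled vertical positions.

    heights_m: vertical positions during the upward (concentric) phase.
    Assumes the load moves with the tracked point.
    """
    heights = np.asarray(heights_m, dtype=float)
    times = np.asarray(timestamps_s, dtype=float)
    displacement = heights[-1] - heights[0]   # meters lifted
    duration = times[-1] - times[0]           # seconds
    mean_velocity = displacement / duration   # m/s
    mean_power = load_kg * G * mean_velocity  # watts, gravity only
    return mean_velocity, mean_power

# Hypothetical squat rep: 100 kg raised 0.5 m over 1.0 s
v, p = rep_metrics([0.0, 0.25, 0.5], [0.0, 0.5, 1.0], load_kg=100)
# v is 0.5 m/s; p is about 490 W
```

In practice you would segment reps and smooth the position signal first; this sketch just shows how far simple displacement-over-time math gets you.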

The final submission

After papier-mâchéing this app together, we were incredibly pleased to qualify as semi-finalists in the hackathon. Keep reading to see where the idea really came from and how we used Bomo in real scientific research!

Early work & inspiration

Before Bomo was actually created, Bomo's older sibling had its origins in the Emory MOTIONS lab as a flex-sensor experiment.

Our idea was to convert changes in flex sensor values into joint angles.
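A simple way to do that conversion is a two-point linear calibration between a reading at full extension and a reading at a known bend. The sketch below (Python, with hypothetical ADC values) illustrates the idea rather than our exact implementation:

```python
def calibrate(raw_straight, raw_bent, angle_bent_deg=90.0):
    """Two-point linear calibration: flex-sensor reading -> joint angle.

    raw_straight: sensor reading with the joint fully extended (0 degrees).
    raw_bent: sensor reading at a known reference flexion angle.
    Returns a function mapping any raw reading to degrees.
    """
    scale = angle_bent_deg / (raw_bent - raw_straight)
    return lambda raw: (raw - raw_straight) * scale

# Hypothetical 10-bit ADC values: 310 when straight, 540 at a 90-degree bend
to_degrees = calibrate(raw_straight=310, raw_bent=540)
print(to_degrees(425))  # halfway between the calibration points, ~45 degrees
```

Real flex sensors are not perfectly linear, so a lookup table or polynomial fit against a reference system (like the lab's 3D tracker) would improve accuracy, but a linear map is a reasonable first pass.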

We hooked up our device onto the fronts of our ankles like so:

 

 

And we ended up with results like these, giving us some great insight into the feasibility of approximating the measurements of expensive 3D systems with simple sensors.

Left: Flex sensor. Right: 3D motion tracking.

We also created a game for our test subjects to play during sessions, which turned out to be a phenomenal way to make the data and research relatable and personal.

 

 

The main concept here was that over time, users would play the game and gradually increase the flexibility of their ankles by receiving real-time visual feedback.

Applications

Current medical use:

I brought this to researchers at Georgia Tech, who happily brought the software onboard for TMJ research, using it to measure intricate jaw movements in children and cross-compare the data with microphone data.

 
 