Logic Pro AR

Idea

Interaction design for the future of music production

Brief


Being a musician myself, I often think about the tools we use for music production and how audio engineering could be made more intuitive. This is one of my ideas for how AR applications might influence musical creativity and editing.

Role
Sole Designer
Timeline
December 2017 (1 weekend)

Idea

How will AR affect music production in the future?

However, a more interesting question to ask is:

What features in current music production software would benefit most from AR and VR?

Looking at most music production software, we see lots of 2-dimensional representations of the 3-dimensional properties of sound.

How sound propagates through air.

Looking at standard EQ interfaces, we can see that different frequencies "overlap" with each other, making it pretty cumbersome to get a full picture of how the frequencies within a sound are separated.

EQ Interface for Logic Pro X.

My first idea was to simply take these EQ curves and expand them into 3D, allowing users to "look around" different frequencies.
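As a rough illustration of that idea (this is a hedged sketch, not part of the original demo), here's how each EQ band's curve could be given its own depth so overlapping bands can be viewed from the side. The bell-curve approximation, the scaling constants, and the use of SceneKit spheres are all assumptions for illustration, not Logic's actual filter math.

```swift
import Foundation
import SceneKit

// Rough bell-curve approximation of one peaking EQ band's gain (dB) at a given frequency.
// Illustrative only; not the real biquad response.
func bellGain(freq: Double, center: Double, gainDB: Double, q: Double) -> Double {
    let octaves = log2(freq / center)   // distance from the band's center, in octaves
    let bandwidth = 1.0 / q             // wider bands (lower Q) spread over more octaves
    return gainDB * exp(-pow(octaves / bandwidth, 2.0))
}

// Build a scene where x = log frequency, y = gain, and z = band index.
// Pushing each band back along z is what lets you "look around" curves
// that would sit on top of each other in a flat 2D EQ view.
func makeEQScene(bands: [(center: Double, gainDB: Double, q: Double)]) -> SCNScene {
    let scene = SCNScene()
    for (index, band) in bands.enumerated() {
        for freq in stride(from: 20.0, through: 20_000.0, by: 100.0) {
            let gain = bellGain(freq: freq, center: band.center, gainDB: band.gainDB, q: band.q)
            let node = SCNNode(geometry: SCNSphere(radius: 0.008))
            node.position = SCNVector3(
                Float(log10(freq / 20.0)),   // x: 0...3 across the audible range
                Float(gain / 12.0),          // y: roughly ±1 at ±12 dB
                Float(index) * -0.5          // z: each band gets its own depth plane
            )
            scene.rootNode.addChildNode(node)
        }
    }
    return scene
}

// Three bands that overlap heavily in 2D but separate cleanly in depth.
let eqScene = makeEQScene(bands: [(200, 6, 1.0), (1_000, -4, 2.0), (4_000, 3, 0.7)])
```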

I took this a step further and thought about how we could bring AR interactions into MIDI rolls and automation. Here is a sketch:
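Alongside the hand-drawn sketch, here's a tiny data sketch (purely illustrative, not from the project) of how MIDI notes and an automation lane could share one 3D space: time along x, pitch along y, and the automation value as depth.

```swift
import Foundation

// A hypothetical MIDI note: pitch as a MIDI number, start and duration in beats.
struct Note {
    let pitch: Int
    let startBeat: Double
    let durationBeats: Double
}

// Place a note in 3D: x = time, y = pitch, z = the value of an automation lane
// (e.g. a filter sweep) sampled at the note's start. In the AR concept, the
// automation curve becomes terrain the notes sit on rather than a separate lane.
func position(for note: Note, automation: (Double) -> Double) -> (x: Double, y: Double, z: Double) {
    (x: note.startBeat,
     y: Double(note.pitch - 60) / 12.0,   // octaves above or below middle C
     z: automation(note.startBeat))       // 0...1 automation value as depth
}

// Example: a short phrase over a linear 4-beat fade-in.
let phrase = [Note(pitch: 60, startBeat: 0, durationBeats: 1),
              Note(pitch: 64, startBeat: 1, durationBeats: 1),
              Note(pitch: 67, startBeat: 2, durationBeats: 2)]
let fadeIn: (Double) -> Double = { min($0 / 4.0, 1.0) }
for note in phrase {
    print(position(for: note, automation: fadeIn))
}
```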

Off to After Effects!

I created some basic assets in Illustrator and put together a short demo of how an interface like this might be used in the near future. Pretty cool, I think.


For the future:

Soon, I'll be creating an entire system of gesture-based music production. Here are some sketches exploring transitions between sitting and standing contexts, and how simple controls (for example, a track's volume) shift from being controlled by the hand to being controlled by the entire body.
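As a placeholder for how those sketches might translate into logic, here's a small hedged sketch: the same volume control reads a vertical position, but the range it maps over changes with the sitting/standing context. The ranges are made-up numbers for illustration, not measurements from the project.

```swift
import Foundation

// Hypothetical contexts from the sketches: seated uses small hand movements,
// standing maps the same control onto a larger, whole-body range.
enum ProductionContext {
    case seated    // volume driven by hand height over a ~0.3 m range
    case standing  // volume driven by body reach over a ~1.5 m range
}

// Map a vertical position (metres above a reference point, e.g. the desk or floor)
// to a 0...1 track volume, clamped.
func trackVolume(heightAboveReference: Double, context: ProductionContext) -> Double {
    let range: Double = (context == .seated) ? 0.3 : 1.5
    return min(max(heightAboveReference / range, 0.0), 1.0)
}

// A small wrist raise while seated and a full-body reach while standing
// land on comparable volume levels.
print(trackVolume(heightAboveReference: 0.15, context: .seated))    // 0.5
print(trackVolume(heightAboveReference: 0.75, context: .standing))  // 0.5
```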
