For my World’s on a Wire midterm I decided to continue some of my previous work with virtual instruments. Previously I used Unity to explore deconstructing and playing songs by navigating through an environment with a third-person avatar.
This time around I approached the idea in VR, using Unreal Engine. My plan was to trigger different sound elements by throwing objects at targets. My hope was to have the targets indicate they had been triggered not only with a sound clip, but also physically, by spinning and lighting up.
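Roughly, each target just needs to listen for a physics hit and then start its sound layer, turn on a light, and begin spinning. Here is a minimal sketch of that idea as an Unreal C++ actor; the class name, components, and default values are illustrative rather than lifted from my project, which could just as easily be wired up in Blueprints:

```cpp
// Hypothetical "sound target" actor: plays a sound, lights up, and spins
// when a thrown object hits it. Names and values here are placeholders.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "GameFramework/RotatingMovementComponent.h"
#include "Components/StaticMeshComponent.h"
#include "Components/AudioComponent.h"
#include "Components/PointLightComponent.h"
#include "SoundTarget.generated.h"

UCLASS()
class ASoundTarget : public AActor
{
    GENERATED_BODY()

public:
    ASoundTarget()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;
        Mesh->SetNotifyRigidBodyCollision(true); // generate hit events when struck

        Audio = CreateDefaultSubobject<UAudioComponent>(TEXT("Audio"));
        Audio->SetupAttachment(RootComponent);
        Audio->bAutoActivate = false; // wait for a hit before playing

        Light = CreateDefaultSubobject<UPointLightComponent>(TEXT("Light"));
        Light->SetupAttachment(RootComponent);
        Light->SetVisibility(false);

        Spinner = CreateDefaultSubobject<URotatingMovementComponent>(TEXT("Spinner"));
        Spinner->RotationRate = FRotator::ZeroRotator; // start still
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Mesh->OnComponentHit.AddDynamic(this, &ASoundTarget::OnHit);
    }

    UFUNCTION()
    void OnHit(UPrimitiveComponent* HitComp, AActor* OtherActor,
               UPrimitiveComponent* OtherComp, FVector NormalImpulse,
               const FHitResult& Hit)
    {
        // A thrown object struck the target: start the sound element,
        // light up, and begin spinning.
        Audio->Play();
        Light->SetVisibility(true);
        Spinner->RotationRate = FRotator(0.f, 180.f, 0.f); // yaw in degrees/sec
    }

    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* Mesh;
    UPROPERTY(VisibleAnywhere) UAudioComponent* Audio;
    UPROPERTY(VisibleAnywhere) UPointLightComponent* Light;
    UPROPERTY(VisibleAnywhere) URotatingMovementComponent* Spinner;
};
```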
Unfortunately, moving between the two engines wasn’t a one-to-one translation, and translating a good experience into VR proved more difficult as well. Here is a demo of my midterm:
I was hoping to have the sound objects spin, but I was not able to combine that behavior in time, so to demonstrate the rotation element I created a separate demo spinner. I would like to tie that rotation to an audio attribute: for example, spinning faster could make the sound louder, speed up playback, or reverse it. I also knew that turning playback off by hitting a target a second time wouldn’t be as easy, so that is another aspect I need to explore.
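Continuing the hypothetical ASoundTarget sketch above, the spin-to-audio mapping could be as simple as reading the spinner’s rate every frame and scaling volume and pitch from it. The 360-degrees-per-second “max speed” and the volume/pitch ranges below are placeholder numbers, not tuned values from my project, and this assumes ticking is enabled and Tick is declared in the class:

```cpp
// Hypothetical: map spin speed to audio parameters each frame.
// Assumes PrimaryActorTick.bCanEverTick = true was set in the constructor.
void ASoundTarget::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Current yaw speed, normalized so one full turn per second maps to 1.0.
    const float SpinSpeed  = FMath::Abs(Spinner->RotationRate.Yaw);
    const float Normalized = FMath::Clamp(SpinSpeed / 360.f, 0.f, 1.f);

    // Faster spin -> louder and faster playback. True reverse playback isn't
    // just a negative pitch in Unreal; it would likely need a pre-reversed
    // audio asset or a dedicated plugin.
    Audio->SetVolumeMultiplier(FMath::Lerp(0.5f, 1.5f, Normalized));
    Audio->SetPitchMultiplier(FMath::Lerp(0.8f, 1.5f, Normalized));
}
```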
These interactions could be used for storytelling by letting the player explore songs, words, or other sounds in a “tangible” way.
I think this could be worth exploring for my final, time permitting. I found it can be too easy to get distracted by technical details and lose momentum in the narrative itself.