Progress on prototype
Since the initial Base Converter prototype was designed as a desktop game, I need to redesign some components so that they are appropriate for VR. This includes important considerations like scale (how big should the loom be in relation to the player?), clicking (how will the player make their selection?), and even typing. Typing is another big issue, but I will ignore it for now, as it is more of a technical problem I can solve once I am more confident in my concept (though I admit that exploring typing now may elicit some interesting discoveries).
For this week, I have redesigned the environment in all three scenes and added VR functionality such as allowing the player to walk around the space. Scene-change buttons can now be 'teleported' to, and since they are set as triggers, they transport the player to the next scene. However, interaction with the other, non-scene-change buttons needs to be thought out more carefully.
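As a rough illustration, the trigger-based scene change described above could look something like the following Unity script; the scene name and player tag here are placeholder assumptions, not the actual values from my project.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to a scene-change button whose collider has 'Is Trigger' enabled.
// When the player teleports into the trigger volume, the next scene loads.
public class SceneChangeTrigger : MonoBehaviour
{
    // Placeholder scene name; set in the Inspector to the actual next scene.
    [SerializeField] private string nextSceneName = "WeavingScene";

    private void OnTriggerEnter(Collider other)
    {
        // Assumes the player rig (or its collider) is tagged "Player".
        if (other.CompareTag("Player"))
        {
            SceneManager.LoadScene(nextSceneName);
        }
    }
}
```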
Within the start scene, players can now move and look around the floating platform.
They can climb up the rope tower and look at the scene from the top.
The player is no longer restricted to facing the 'front'. Instead, they have a 360° view of the scene.
View of the sky.
The weaving scene is now also VR friendly, allowing players to move around and explore their own woven structure.
A unique feature is how the game allows players to view their creation from different angles as though they are inside what they created. Scale plays an important role here.
In order to enable a more intuitive weaving experience, an alternative controller is needed. To start, I needed to figure out a way to control what happens in Unity with an Arduino, so that I can later design a controller using physical computing components. I began by looking for tutorials and found a plug-in in the Asset Store that uses serial communication, but its code proved unnecessarily complex for what I wanted to test, with very little documentation available. After some more digging, I found a much simpler script that let me control the weaving happening in Unity with a button (again, through serial communication). I then came up with the idea of creating a more ergonomic form that resembles the shuttle used to weave on a loom, and using a force-sensitive resistor to 'activate' the weaving, just as the button did.
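A minimal sketch of the Unity side of this setup, assuming the Arduino writes one analog reading (0–1023) per line over serial; the port name, baud rate, threshold, and `Weave()` hook are placeholder assumptions, and the actual script I used may differ.

```csharp
using System.IO.Ports;
using UnityEngine;

// Reads force-sensitive resistor values sent by the Arduino over serial
// and triggers a weaving step when the pressure crosses a threshold.
public class WeaveSerialInput : MonoBehaviour
{
    // Placeholder port name and baud rate; match these to the Arduino sketch.
    [SerializeField] private string portName = "COM3";
    [SerializeField] private int baudRate = 9600;
    [SerializeField] private int pressThreshold = 500; // 0-1023 analog range

    private SerialPort port;
    private bool wasPressed;

    private void Start()
    {
        port = new SerialPort(portName, baudRate) { ReadTimeout = 10 };
        port.Open();
    }

    private void Update()
    {
        try
        {
            // Expects one reading per line, e.g. "612".
            string line = port.ReadLine();
            if (int.TryParse(line.Trim(), out int value))
            {
                bool pressed = value > pressThreshold;
                if (pressed && !wasPressed)
                {
                    Weave(); // advance the weaving, as the button did before
                }
                wasPressed = pressed;
            }
        }
        catch (System.TimeoutException) { /* no new data this frame */ }
    }

    private void Weave()
    {
        // Hypothetical hook: call whatever adds the next woven row here.
    }

    private void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```

Reading only on press transitions (rather than while pressure is held) mimics the single 'click' behaviour of the original button, which is why the sketch tracks `wasPressed`.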
One piece of feedback on the non-VR version of this game was that the use of a button to complete the weaving on screen felt "extremely antithetical to the concept, which is inherently physical". The question raised was that weaving itself is inherently physical, while weaving on screen does not give the same embodied experience. However, incorporating a more tactile interface this time around (the force-sensitive resistor, i.e. a touch sensor) seems to have made the weaving in the VR version more embodied: the test subject who made the initial comment on the non-VR version felt that the VR version with the touch sensor successfully recreated some of that embodied, physical experience.
Moreover, other users seemed to enjoy the ability to walk around the scene and view their woven structure from different angles.
However, the teleportation from one scene to another felt confusing and unintuitive to some users. The transition currently works by having the player teleport into an object that acts as a trigger and takes them to a different scene; essentially, you have to 'bump into' an object in VR to move on to the next scene. This transition needs to be thought out more carefully to make it intuitive, and the solution will also depend on the kind of controller users have in their hands when they're playing the game: a custom, more intuitive controller, shaped specifically for the kind of craft work they perform in VR.