Reflexive journal

VRCollab-Reflexive-Journal_template_2023.docx

September

9/22/2023

Decide and develop concept
Getting the developer environment set up: trying to get Unity XR Hands working with the Meta Quest headset and running Unity automated tests in CI. Realised this was a low-priority task at this stage
Getting collaborative tools set up (Slack for chat; Coda for documentation and project management)
Documented project goals, audiences, schedule, team members and responsibilities, as well as recording questions and ideas in the public-facing Coda doc
Uni session to refine the concept: useful for gaining confidence presenting the ideas to others, and useful feedback questioning where the experience fits into runners' lives and whether it should start by asking runners questions to personalise it for them
Met up with Rebecca to refine the concept. She had some good ideas around suggesting only 2 narratives to runners (using initial personalisation to find a relevant and an alternative narrative), as all 6 narratives may make too long an experience. For audience, she asked who this will have the most impact for, so I narrowed it to

6/10/2023

Did a visual moodboard. Discussion with Rebecca, who questioned the neon/club aesthetic (think I just did a generic VR aesthetic). Both drawn to the bold colour and graphic lines of an athletics track. Refined that into a basic visual identity with brand personality, colours and typography in Figma. Not tested in VR yet.
R suggested doing some creative writing to answer questions around the experience. I wrote an intro and tried editing the tone of voice in ChatGPT. This resulted in the idea of a guide who takes you through the experience, a friendly peer. R suggested the track could talk, which gives interesting visual FX opportunities to change the appearance/pattern of the track lines for narrative purposes
Did not collect motion and audio examples as I had planned
Some initial development environment setup; having issues getting native hand tracking to work in Unity Play Mode with either Pico or Meta headsets
Had a call with research author Ciara Everard, which was awesome. We both have a similar approach to the sports psych elements, and she was very encouraging/interested in the project and offered to meet online once a week for the duration of the project. She helped answer a lot of big questions around whether our suggested approach would work, so having sports psych expertise will really help the project. Feeling really motivated by this
Plan for sprint week:
Storyboard user journey ideas
Write out dialogue for a minimal end-to-end experience (e.g. intro, one or two questions and outro) in a format that can be user tested
User test the dialogue
Start building out minimal e2e in VR/MR, focus on interaction not visuals

Build initial wk1/3 (Sprint 1)

Original plan was to try out different user journeys (e.g. narrative scene explore vs consultation vs interactive questionnaire). Duncan suggested this was too much for this week, so scaled back to a basic end-to-end involving an intro and one narrative explore scene (the Resilience narrative).
Thought about scene design; decided to work with Merry-Go-Round as it's the most dramatic. Watching the video, there are so many visual metaphors or things that could work in VR, but it's hard to think of an overarching visual metaphor or how to tie the info about the narrative/athlete stories together. It's hard to balance users' agency against needing them to understand things in a certain order (e.g. the 3 stages of Merry-Go-Round). Feeling like there are a lot of exciting ideas (merry-go-rounds with athlete bodies, mirrors showing distorted/disconnected bodies, glasses that can be put on as lenses to the world) but a bit out of my depth with interaction/narrative design

10/10/2023

Planned to meet with Duncan to discuss interaction design options for the merry-go-round scene, but he was unavailable until tomorrow. Switched to setting up the dev env. Decided to go full Meta/Oculus to test out that ecosystem. Upgraded to the latest Oculus Unity SDK and got Building Blocks, which was cool. Having issues with Play Mode (pass-through not working, hand tracking does), and unsure of the best setup for Meta headsets. I was using OpenXR/SteamVR through Oculus Link; Dominque advised that using Oculus and Oculus Link allowed pass-through to work. I switched to that after resolving an issue with Oculus not being set as the active OpenXR runtime.
At that point, pass-through worked in Play Mode and the built APK, but hand tracking only worked in the built APK, not Play Mode.
We rebuilt the project without Building Blocks, using Dominque's steps. After this, pass-through worked in Play Mode and the built APK, but hand tracking did not work in either mode. There was a warning about experimental features when building; using the modal to change the setting didn't work, so I had to send an ADB command via Meta Quest Developer Hub.
At that point, pass-through again worked in Play Mode and the built APK, but hand tracking only worked in the built APK, not Play Mode.
Possible points of failure:
Headset
PC
Oculus app
Unity
App version
Oculus SDK
Pass-through settings
Hand-tracking settings

11/10/2023

Talking through narrative design with Rebecca, who needed to know the key points from each scene and user emotions so other elements can be designed. She wrote these up from the video (broken link). Discussing the user journey and interactions, we decided a few things:
Within merry-go-round scene there is a linear narrative (the 3 stages from the video). There may be elements of user agency within the 3 stages. This is because the merry-go-round narrative has a linear path and it works well in the video
Physical manipulation of objects via hand tracking is important to bring the story to life (e.g. holding an athlete's shoes). Manipulation could control other visual/audio elements if relevant
Need to decide if there is one visual/interactive element that persists through the scene (e.g. a merry-go-round track that changes throughout the scene) or if there are separate visual/interactive elements
Technical issues (hand tracking in Play Mode) persist, which is taking up most of my time. Resolved at end of day by switching PCs

12/10/2023

Now hand tracking is resolved, finishing off the dev env with a basic hand-tracked MR scene. Feel a bit lost as I can't visualise the scene, and a bit burnt out by the technical issues / lack of progress in the sprint.
Tried throwing a 3D model of the track into the scene after editing it in SketchUp. Scale is important: it feels a bit strange having a tiny track around you. The angle is also important, as the asset has no height, so it needs to be low down or angled towards you. Setting it to real scale is interesting, but it doesn't feel like it's on the floor to me without feet. I'm not sure the track is the thing to link everything together, but it's hard to tell at this stage.
Added the Oculus Voice SDK for TTS to prototype the script. This was more successful, and it's good to hear the audio in the scene

13/10/2023

Duncan says one scene could be ambitious on its own (e.g. if there is a merry-go-round that changes in relation to the story or interaction), so just concentrate on getting the video audio into the scene and breaking it up with interactions to see how that works out. Need to build this to answer questions:
How well does the video narrative fit in VR?
Do people listen to the narrative in VR?
Completed that scene with a VO narrator that autoplays; when a narrator section finishes, a box appears - if the user touches it, it plays an athlete story. The cycle repeats for the whole scene. Initial thoughts:
Useful to see how the narrative plays out in VR (well, except it feels long)
the hand tracking on the cube kinda feels like you're holding the voice, which is quite nice. it's a strange effect when the cube disappears, like the person vanishes
the interaction gets quite repetitive as you move through the scene
be interesting to see how much the interaction distracts from understanding the concepts
the video relies on text to emphasise points from the voiceover; feels like this might be necessary, or the voiceover could be simplified
initial test feels like a lot of the original video's structure could be used, just edited and augmented - but need to user test
Not too sure where to go next. Wrote up a user test plan

Build initial wk2/3

17/10/2023

Rebecca refined the VO narrative (more concise, leaves out details about the research, addresses the user more directly) and the user research questions so it's easier to collect answers (changed some open-ended questions to use a Likert scale or multiple choice). Also added it to a form. I feel more confident about testing it now; just need to find some people to test with
Going to sketch out some interactions for athlete stories, as I'm more confident these will be in it, while waiting for user test results

18/10/2023

Three main questions to focus interaction sketches:
Starting with the 1st one. Sketched out an animation of the virtual box to illustrate the athlete story. It's a bit strange to have something you are holding start floating off and doing things; maybe useful for the body-out-of-control theme. Instead of planning all interactions up front, I was just designing each one as I came to it. This meant I didn't have the right mindset (building, not designing), I didn't consider what was coming before or after, and I was trying to put too much into the first interaction.

20/10/2023

Rebecca helped with discussing the big questions above. This was really useful for being decisive about all these things (as time is running out), and particularly for planning out the interactions: some can be really simple while others do the heavy lifting.
VR or MR?
This will be a Virtual Reality experience as in our user testing all 3 participants found the mixed reality pass-through distracting, with no real benefit. Virtual Reality will help increase immersion and we will have more control over what to focus the user's attention on, as well as more compelling lighting and graphics options. Users tend to move more freely around space in Mixed Reality, but the experience needs to work seated for injured athletes, so this is less relevant for this experience.
What happens in the experience?
The user explores sports injury narratives in a virtual environment. The user connects to athlete stories by using their hands to interact with virtual objects that represent the athlete's story.
Basic version
There will be an intro to explain what the experience is about and to get the user used to hand-tracked interactions, so they know what to expect and how to use the experience.
They will proceed to the Merry-Go-Round scene.
This will follow the narrative structure of the Merry-Go-Round YouTube video (narrator voiceover explains concept from injury narrative then it is backed up by athlete story voiceover).
In this experience athlete stories will be triggered by the user interacting (via their hands) with virtual objects that appear in front of the user when the narrator voiceover has finished a section. For example, a virtual shoe will appear and when the user handles the shoe the athlete's story is played. We aim for this to make a physical-emotional connection between the user handling an object and hearing the athlete's story.
There will be 6-8 athlete stories
Outro can direct users to the Sports Injury website to view the other injury narratives
Extended version / reach goals
As above, but instead of going straight to Merry-Go-Round they can choose between that scene, Resilience, or Longevity
At the end of an injury narrative scene the user can view a contrasting narrative
The narrator asks the user questions relating to the scene to aid their comprehension, user can respond via voice
What does it look like?
The environment will be a liminal space that allows the virtual objects relating to athlete stories to take centre stage. The liminal space can change in relation to the story (e.g. it may turn dark and rain in the space, or brighten to white)
The aesthetic of the virtual objects is currently unknown, we will work with placeholders to start with (e.g. there will be a 3D object that looks like a running shoe, but aesthetically it may not match up with elements in the scene) and refine the aesthetic over time.
What interactions will happen?
We have worked on a first draft of interactions for athlete stories and objects that trigger them: I will work on sketching these out next and we will iterate on these.

23/10/2023

Meeting with Naomi useful:
Keeping interaction simple and physically holding the stories is nice
Think about environment (e.g. gym or mountains)
Sound could be powerful in setting scene (e.g. clanking weights)
Have a look at VR experiences: Notes on Blindness and the Under Presents
Started adding in the 3D objects. Found it slow to get used to Blender and hard to do the cracking shoe effect. Feeling a bit pessimistic about how this will go as someone without 3D asset pipeline knowledge

24/10/2023

Watched Notes on Blindness, which is very compelling and has some themes that I'm trying to cover in my project, like creating liminal spaces with weather and expressing abstract concepts through sound and vision in VR. Reading the design notes, it mentions it is an "intermittent and fragmented world". It would be good to nail down what the world of the merry-go-round is. Makes me think I need to start thinking more about environment design and the overall aesthetic.
Got the hang of finding 3D assets, using Adobe Stock, Sketchfab, and a 3D printing site , and found a good workflow for tidying up the assets in Blender and importing them into Unity
After failing to crack the shoe in Blender yesterday, I tried breaking the mesh in Unity (), which works fine after a little tweaking.
Managed to get in all the 3D assets. They are far from perfect but can give an indication of how they might work. Fabrics (e.g. the vest) could be interesting for hand tracking. They all look very different; I did briefly play with using stylised shaders on the shoe (e.g. how Goliath and Notes on Blindness do) - this could be an easy way of giving the 3D objects a consistent aesthetic

25/10/2023

Tried out "The Under Presents" VR experience, which had a cool toon shader aesthetic, atmospheric environments, and interesting locomotion and controller/hand interactions. I wasn't as drawn to the narrative as with "Notes on Blindness" or "Goliath"; it seemed more like a game, and I wasn't in the frame of mind to try to figure out the puzzles, so I didn't spend long with it.
Spent some time moodboarding, narrowing down the audio-visual aesthetic and starting to think about the environment.
Switched the Unity project over to using Timeline for sequencing, which should make iterating much easier (previously I was using scripts for timings)
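For illustration, a minimal sketch of what the Timeline switch looks like in Unity: a PlayableDirector component drives the sequenced VO/VFX, so timings live in the Timeline editor instead of hard-coded script delays. The class and method names here are my own examples, not from the actual project.

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Hypothetical sketch: a thin wrapper around PlayableDirector that
// replaces script-driven timings with a Timeline asset.
public class SceneSequencer : MonoBehaviour
{
    public PlayableDirector director; // assigned in the Inspector

    void Start()
    {
        director.Play(); // play the scene's Timeline asset from the top
    }

    // Example of jumping the playhead, e.g. for a skip button.
    public void JumpTo(double seconds)
    {
        director.time = seconds;
        director.Evaluate(); // apply the new time immediately
    }
}
```

The benefit over script timings is that VO, VFX, and interaction cues can be nudged around on tracks visually, without recompiling.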

30/10/2023 - 3/11/2023

- Unity: Add rain VFX on athlete story 7. Change sky colour as story progresses.
- Unity environment: add background weather sounds and skybox with fog
- VO: shortened the narrator VO and used ElevenLabs for AI VO. Plan to re-record with a real voice, but AI is useful for iterating
- VO: Ciara is still trying to get hold of the athlete stories without the YouTube video music underneath, which is distracting
- Feedback: Ciara feels the narrator VO and athlete stories are now disconnected, with the narrator missing key info from the research. Rebecca feels that key info from the research is not important for her in experiencing the Merry-Go-Round. I think this is a result of me not being clear about the mix of sports psychology/expressive narrative I am going for. Having now defined this (narrator VO delivers the sports psychology that underpins the athlete experience; athlete VO gives the emotive story, backed up with expressive audio/visuals/interactions), it should be easier for my contributors to give feedback within that framework
- Writing: Rebecca wrote an intro/tutorial for the experience (added to )

06/11/2023

Reviewing the intro/tutorial text; feels like it needs storyboarding, as some elements can be shown rather than told

07/11/2023

Storyboarded tutorial
Trying to make the environment feel more expansive by editing the skybox in Photoshop, which helped, but the image is still really grainy in the headset. Need to figure out how the track lines will work. The feedback loop from Photoshop to headset is quite long; going to try Gravity Sketch.
Gravity Sketch doesn't feel like the right tool (I was trying to paint a big blue environment around myself). Instead I just used a procedural gradient skybox from the Oculus samples.
Created some track lines in Blender using correct measurements from a track specifications website. They work well in focusing the user's attention and giving a feeling of being grounded, while the gradient background keeps a liminal-space feeling. I've struggled with the environment design, but I think it's working well in the new intro scene

08/11/2023

Build out intro scene, with text and athlete animation.
I got the athlete 3D model and animations from Mixamo. The athlete was too realistic for my liminal aesthetic, so I decimated it (reduced the number of triangles/quality of the 3D model) in Blender. I got help from Tosin with joining the animations together in Unity.

09/11/2023

Built out tutorial scene to get user used to hand tracking
Created glow effect by using two quads with gradient textures, one with additive shader the other with multiply shader. Quite a nice effect without the performance overhead of post-rendering.
Created a hand asset by going into Play mode in Unity, pausing it and creating a prefab from the hand, e.g.

10/11/2023

Review script feedback from Ciara, adding in some missing concepts from the research. Create storyboard for Merry-Go-Round, even though a lot of the scene is already built I found this process useful for intro/tutorial in syncing up VO, audio, visuals and interactions. I think its also useful for the other contributors to the project to understand my intent and give feedback.

13/11/2023

Updated narrator VO with script changes
Change merry go round skybox to use gradient
Add skip button to tutorial and intro

14/11/2023

Highlight track lines animations

15/11/2023

Add glow effect to all athlete stories
Add script to check when the user brings an object close to them, to play a sound. Add to athlete objects
Add breath FX sound looping on athlete objects before they are brought close
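The proximity check above can be sketched as a small Unity component. This is a minimal illustration, assuming the headset position is the main camera; the class and field names (ProximitySound, playDistance) are my own, not the project's actual script.

```csharp
using UnityEngine;

// Hypothetical sketch: plays the athlete story audio once when the user
// brings this object within a threshold distance of their head.
public class ProximitySound : MonoBehaviour
{
    public AudioSource storyAudio;     // the athlete story clip
    public float playDistance = 0.35f; // metres from the headset

    private bool played;

    void Update()
    {
        // Camera.main is assumed to track the headset in an XR rig.
        Transform head = Camera.main.transform;
        if (!played && Vector3.Distance(transform.position, head.position) < playDistance)
        {
            storyAudio.Play();
            played = true; // only trigger once per object
        }
    }
}
```

A looping breath FX could live on a second AudioSource that is stopped at the same moment the story audio starts.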

16/11/2023

User test in London with Ciara
Adjust storyboard - remove some slides (detail about research), add some lines that add dramatic context (injury is devastating / injuries contribute to career trajectory)

20/11/2023

Add in clean audio for athlete stories (without music from YouTube video)
Create script for cloning dumbbells and piling them up, didn’t look right
Create dumbbell melt animation in Blender
Make AS8 show medal to pick up

21/11/2023

Figure out timings/priorities for last tasks
Plan user test 2 and contact possible participants

22/11/2023

Change shoe to fixed snap and use new shoe asset
Add intro hold scene (requires button to start)
Update VO script in intro
Improve audio in athlete story 7 (storm)
Make objects consistently float away to fixed points at end of athlete story

23/11/2023

Fix gold medal staying stuck to silver medal
Rain athlete objects
Plan sound design for Rebecca to do at weekend
Add better sounds for athlete objects
Ciara recorded narrator VO

24/11/2023

Add in Ciara's version of the VO - sounds loads better than the AI one
Improve intro with VFX to illustrate athlete telling stories and infographics to

27/11/2023

Mostly distracted by discussion about change of venue from Arnolfini
Dominque tested and got stuck in the tutorial as they didn't hear the instruction - need to back it up with a text instruction
Improve tutorial intro
Rebecca sourced FX and background sounds - reviewed those

28/11/2023

Improve sound design with Rebecca’s new sounds
Add spatial sound to athlete objects to make it sound like the object is talking to the user
Add text fades to intro text
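The spatial sound change amounts to making each athlete object's AudioSource fully 3D. A minimal sketch, assuming Unity's built-in audio spatialisation (the component name and distance values are my own illustration):

```csharp
using UnityEngine;

// Hypothetical sketch: configure an athlete object's AudioSource so the
// voice appears to come from the object itself rather than playing in 2D.
[RequireComponent(typeof(AudioSource))]
public class SpatialStoryAudio : MonoBehaviour
{
    void Awake()
    {
        AudioSource src = GetComponent<AudioSource>();
        src.spatialBlend = 1f; // 1 = fully 3D positional, 0 = flat 2D
        src.rolloffMode = AudioRolloffMode.Logarithmic;
        src.minDistance = 0.5f; // full volume within arm's reach
        src.maxDistance = 10f;  // fades out beyond the play space
    }
}
```

With spatialBlend at 1, holding the object near the ear naturally strengthens the "object is talking to you" effect described above.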

30/11/2023

Add "bring me closer" text in tutorial to fix bug where users didn't know what to do in the tutorial
Add outro / credits