The new Stella McCartney and Adidas Fall/Winter 2017 collection is founded on ‘dynamic designs engineered for exploring different environments’. Bringing together two separate environments under the concept of exploration and adding the tangibility, movement and detail of fabric was a challenge we couldn’t refuse.
Visualise were asked to include city and natural-world locations, as well as the 4 models styled in various pieces of clothing from the new line. We took this as an opportunity to try some complex editing techniques that had never been used before at Visualise.
“With the new FW17 collection, every design has been considered so women can be creative and thrive whatever their environment, their workout, or what the weather is like. With this playful virtual reality experience we’ve not only communicated this, but done so in a fun and innovative way.”
Shooting in Stereo
To create a 6k x 6k stereo image at the highest possible quality, we shot the models using a nodal slicing technique. This involved recording 90 degrees of footage at a time, using a pair of Sony DSLRs, on our custom-built ‘Johnny Five’ rig.
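As a rough illustration of how four 90-degree slices map onto a single equirectangular frame, the sketch below divides an assumed 6144-pixel-wide frame into equal column bands, one per slice (the exact frame width and layout are assumptions for illustration, not the production pipeline):

```python
# Sketch: map four 90-degree nodal slices onto the columns of an
# equirectangular frame. Frame width and layout are hypothetical.
FRAME_WIDTH = 6144          # assumed pixel width of the "6k" frame
SLICES = 4                  # one slice per 90 degrees of yaw

def slice_columns(slice_index, width=FRAME_WIDTH, slices=SLICES):
    """Return the (start, end) pixel columns covered by one slice."""
    cols_per_slice = width // slices
    start = slice_index * cols_per_slice
    return start, start + cols_per_slice

for i in range(SLICES):
    start, end = slice_columns(i)
    yaw = i * 360 // SLICES
    print(f"slice {i}: yaw {yaw}-{yaw + 90} deg -> columns {start}-{end}")
```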
The quality of the video needed to be in line with the rest of the City X Nature campaign media, so it was important for us to try and capture the fabrics in the highest definition possible. The limited resolution of the Google Daydream headset (the platform on which the final project would be delivered) made this incredibly challenging. Our solution was to carefully monitor proximity to the camera, among other factors.
Creating continuity in outdoor conditions
Changing weather conditions meant that we had to work fast in order to ensure all plates could be seamlessly edited together. We had to be careful with camera exposure, so opted for a minimal lighting setup that could be adjusted quickly. This was especially important during the nature shoot, where exposure had to be constantly monitored so that everything matched fluidly in post-production.
We wanted the ability to edit each of the 4 slices of 360 images independently, which required a custom workflow for Nuke, our compositing software. The process relied heavily on scripting to run all 42 different footage parts through automated processes to render each view.
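That kind of batch automation can be sketched in plain Python. The file names, the `render_view` helper, and the directory layout below are hypothetical stand-ins; in production, logic like this would drive Nuke through its own Python scripting API rather than a stub:

```python
import os

def render_view(src_path, view_name, out_dir):
    """Placeholder for the per-view render step; in a real pipeline
    this would invoke Nuke rather than just building a path."""
    out_path = os.path.join(out_dir, f"{view_name}_{os.path.basename(src_path)}")
    # ... actual render would happen here ...
    return out_path

def batch_render(parts, views, out_dir="renders"):
    """Queue every footage part through every view's render pass."""
    outputs = []
    for part in parts:
        for view in views:
            outputs.append(render_view(part, view, out_dir))
    return outputs

# 42 footage parts, each rendered once per stereo eye
parts = [f"part_{i:02d}.mov" for i in range(42)]
jobs = batch_render(parts, views=["left", "right"])
print(len(jobs))  # 84 render jobs queued
```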
We layered all separate footage parts on top of one another and freely edited each view on the timeline using an alpha channel. This approach gave us the flexibility to edit each part however we liked.
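At the pixel level, layering footage with an alpha channel comes down to the standard Porter-Duff "over" operation. A minimal sketch in plain Python (not Nuke's actual node graph), assuming premultiplied-alpha RGBA pixels:

```python
def over(top, base):
    """Porter-Duff 'over': composite a premultiplied-alpha top pixel
    (r, g, b, a) onto a base pixel, returning the combined pixel."""
    tr, tg, tb, ta = top
    br, bg, bb, ba = base
    inv = 1.0 - ta                    # how much of the base shows through
    return (tr + br * inv,
            tg + bg * inv,
            tb + bb * inv,
            ta + ba * inv)

# A half-transparent red layer over an opaque blue background
top = (0.5, 0.0, 0.0, 0.5)   # premultiplied: red at 50% alpha
base = (0.0, 0.0, 1.0, 1.0)
print(over(top, base))  # (0.5, 0.0, 0.5, 1.0)
```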
In terms of complexity, this rendering job was one of the most challenging we’ve done using Nuke, but ultimately, we felt the flexibility it gave us was well worth it. Colour grading was done in a brand-new VR-based colour grading suite from The Mill; ours was the first project to use it.
The finished video was delivered via an app which is used in conjunction with Google Daydream headsets. Wearers can instantaneously flip between city and nature to have more control over their experience. Over 600 guests from media and the fashion world experienced the VR collection launch from their own perspective at the Tokyo event.