With the planning I did for my ancillaries I used a large amount of software to try to replicate the kinds of images and ideas that I wanted to portray in the final products. My main tool for planning out my ideas was Adobe Photoshop 2015: I researched different magazine and poster ideas and replicated them using the skills I had with this software, trying to make them my own. I preferred this method of planning to written planning, which I did consider but felt would be too disorganised, messy and unprofessional. Seeing as I already had Photoshop and three machines perfectly capable of running it (a MacBook Pro, an iMac and a home-built PC made especially for media production), it seemed the preferred route to go down.
Some of the design work for the ancillaries is shown on the blog in a time-lapse I created of myself manipulating the nose image to make it my own and more suitable for my posters. I wanted to make the same kind of time-lapse of editing Echoes of Mine, but was unable to capture it because I use two monitors when composing my timeline and effects: one for all the composition and editing panels such as the timeline, project bin, effects panel and sound levels (this one happens to be a 21:9 aspect-ratio monitor, which makes the process much easier), and the other for the Program Monitor, which simply displays the video I am editing.
Planning the actual story behind Echoes of Mine went on for much longer than the time we have been working on this unit. Noah Twine and I had had the idea of a dream-based narrative for a short film since late 2013/early 2014, and had been using an online service called StormBoard, where you can essentially create a brainstorm and link related ideas to each other. We used this website for quite some time, and drafted scripts using Celtx, right up until we realised that this unit could make the project possible, at which point we sat down and created a finalised concept for the production.
Just before the final concepts were made, still during the brainstorming stages, I started playing with special effects in Adobe After Effects and made a short briefing post about it on the blog, where I spoke about trying to make extra-physical representations of things like water to see if we could manipulate them in our film.
Soon after this, Noah and I designed a concept for a draft video, using footage that we found online to show what kind of production this would be, along with some dialogue and quotes based around dreams and dream theories. I left it to Noah to compose the video in Adobe Premiere Pro CC while I started thinking of more ideas for the main production itself.
Almost immediately after that video was released, we organised test shots to be filmed with our chosen actress, Darcy Owen, and our DP, Craig Messum, to make another draft teaser video, this time with our own footage. Technology was, for obvious reasons, central to this: the main pieces were two different DSLRs for capturing the main footage and the behind-the-scenes material, along with multiple lenses, filters and attachments for the cameras, and several pieces of lighting and reflection/absorption equipment, all of which proved useful in one way or another during filming. Most of this can be seen in the behind-the-scenes video I created for the blog. Our other group member, Jonny Haines, could not make it to that shoot and so had no part in producing that particular video, focusing his time on research instead.
We did still use audio that wasn't ours to show what we wanted to produce, as at the time we wanted some kind of dialogue, or a poem based around dreams, read over the top of the production. It was later decided that a dialogue-less production, with more focus on the cinematic and non-diegetic side, would benefit the film more.
After those shots were completed and we had decided what the final product would be like, Twine started sending me motifs and ideas for original pieces that we could compose together, to create a soundtrack that was completely original and written especially for the film. The software we used is Logic Pro X, with some extra plugins by Waves Audio that I invested in myself, used in my home studio, where the final pieces, named “Hiatus” and “Reverie”, were created. After I had made the final touches and major edits to these pieces to suit my own personal edit, Noah decided it would be easier and more appropriate to create or buy his own music for his edit, as it would make the production process much simpler for him.
The logo that I now use for Oslo Pictures, my own personal ‘film company’ (or rather the title I like my work to be associated with when discussing it in the film industry), was designed by Twine, as I knew he was extremely talented at graphical artwork. Despite only wanting something simple, I knew he would be the right person to ask, as he takes this kind of work more seriously than anyone else I know, with a large desk-sized graphics tablet and two computers loaded with Photoshop, GIMP and many other programs. The logo was based on a design I found online, but I wanted the bear to be sitting rather than walking, as it was in the example I gave him to work from.
Before the day of shooting for the production, I uploaded to the blog a list of the equipment we were taking with us, which included all the cameras, all the equipment that goes with them, and all the memory cards, batteries and other small items we needed. On the day, before each scene was shot, Haines, Messum and I would do some location shooting using our DSLRs and mobile phones, where appropriate, to capture all the possible areas we wanted to use for each scene. All the scenes were shot with careful consideration of the script, despite the lack of printed camera directions (we wanted to leave those more up to group decisions and ideas on impulse on the day); we all had the script on paper and on our mobile devices so we could bring anything up at any given moment.
After filming was complete, we decided that we should each create a separate edit of the film, so that we would all have to apply our own skills to our own work rather than leaving certain jobs to certain people. I used Adobe Premiere Pro and Adobe After Effects to construct and edit the film, as well as going back into Logic Pro X to make changes to the compositions when needed. All drafts and updates were uploaded to YouTube and this blog.
In the evaluation I have used the video format to show the audience feedback for my main task: I recorded myself as a voice-over talking through the comments made about the film, using my Rode NT1-A running through my Focusrite Scarlett 2i4 audio interface into Logic Pro X and then Premiere Pro once again. I had planned to do this for all of my tasks, including the ancillaries, but it would have proved too time-consuming, and recycling exactly the same media form would not have been very creative. Instead I used Prezi (an interactive presentation platform) to showcase the comments made about my poster and my magazine, which I feel was also a good way of expressing those comments and how I feel about them, as I could show small screenshots or images directly related to what I was talking about.