
Zero Dark Nerdy by Isaac Marchionna

IMG_3228.jpg

Fundamentally, there are a few ways to film into the darkness of an evening. Option 1: bring a lot of lights. Option 2: bypass the whole "visible spectrum" and go straight into infrared. That isn't to say Option 2 is without its quirks, all of which presented an interesting learning curve when put to use on a project for a company that specializes in illuminating the darkness.

IMG_3226.jpg
IMG_3230.jpg

Our client asked my creative partner and me to film a night 'live fire' rifle shoot using their own in-house night vision system. This client specifically creates IR aiming and illumination systems for everything from individual soldiers' rifles up to crew-served machine guns. The advantage of the arrangement is that it allows us to give one-to-one feedback, from the standpoint of filmmakers, on how we think their night vision system can be improved.

The BE Meyers OWL night scope is unique in a few regards:
- It utilizes C-Mount lenses
- It creates a non-vignetted image on full-frame sensors

These two aspects matter because conventional systems, such as the AN/PVS-14, are limited to a 40-degree field of view. Suffice it to say, that isn't very helpful. The PVS-14 also has two modes of adjustment: a front objective lens for focusing, and a diopter on the rear for adjustment to the human eye. Things get a bit hairy when you have to adapt a system developed for fighting to one used for fun. That involves step-up rings adapted to a donor lens, so now you have three systems of focus to nail: the objective end, the back-focus of the night observation device (referred to hereafter as a NOD), and finally the focus of the donor optic connected to the NOD.

Simply put... this is a mess, and an absolute soup-sandwich for filmmaking.

So when presented with the BE Meyers OWL, we overcame a few problems right off the bat. First, we gained the ability to interchange relatively low-cost lenses with adjustable irises, a repeatable, fixed focus system, and one-step mounting. In the case of the OWL it was as simple as specifying that we wanted a Canon EF mount (swapping between PL and EF on the RED's side was as simple as removing four screws, swapping, and reinstalling).

Overall, this means one system to achieve the desired result. As long as the back-focus of the OWL is set correctly, you can remove and reinstall it reliably, without any issues on the EPIC or the optic.

 

Example: 'Zero Dark Thirty' (ARRI Alexa + NOD)


Unfortunately, we also learned a few downsides. The biggest is that, in this project's case, we didn't learn until the night before that the OWL used C-mount lenses, which, while a pleasant surprise, weren't a type of lens we had kicking around. So we were locked into the supplied 50mm, and on the night of the shoot we simply rolled with the focal length we were dealt. It does, however, open up intriguing future opportunities to source more C-mount lenses in shorter and longer focal lengths. C-mount lenses also tend to be fairly affordable and fast (this one was an f/0.95).

The other technical issue is that it's all focus by hand, which means focus and iris pulls become a fun game of guesswork. A good-quality NOD tube, the photoelectric plate that actually pulls light in and amplifies it a million times to a visible level, runs at about 600-800 lines per inch. That means you're essentially filming a barely-720p image through a 5K sensor, then outputting it to a field monitor. As a result it's pretty tricky, though not impossible, to get solid focus.
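To put rough numbers on that mismatch, here's a quick back-of-the-envelope sketch in Python. The "720p-class" figure for the intensified image and the 5K sensor dimensions are the ballpark numbers from this post, not measured specs.

```python
# Back-of-the-envelope oversampling math: a roughly 720p-class intensified
# image recorded on a 5K sensor. Figures are ballpark, not measured specs.

nod_detail = (1280, 720)     # approximate usable detail from the NOD tube
sensor_px = (5120, 2700)     # approximate RED EPIC 5K recording area

oversample_x = sensor_px[0] / nod_detail[0]
oversample_y = sensor_px[1] / nod_detail[1]

print(f"Horizontal oversampling: {oversample_x:.1f}x")  # 4.0x
print(f"Vertical oversampling:   {oversample_y:.1f}x")  # 3.8x
# Each "real" line of NOD detail gets smeared across roughly four sensor
# photosites, which is why judging critical focus on a field monitor is hard.
```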

The C-mount lens is also un-geared, which prevented us from using our ARRI follow focus; that, combined with a larger-diameter focus wheel, would have made focus pulls less of a head-scratcher. And because we only received the optic the night before, we weren't able to source any zip gears. After that evening's shoot, our hope is to either source small enough zip gears or have a set of Delrin gears machined and press-fit onto the C-mount lens. That combination would essentially allow us to drive these lenses in the same manner as a normal DSLR or cinema lens.

Another interesting facet of night production is that you're constrained to using either natural moonlight (albeit amplified by a factor of a million) or additional infrared light. The OWL has its own IR illuminator, but this amounts to an IR LED. As a result it tends to overexpose anything within a short throw of the OWL without really getting the light where it's needed in outdoor environments. Thankfully, the products being filmed were incredibly powerful IR illuminators/designators, which meant that by essentially bouncing the light from one of these weapon-intended light sources, we could kick additional IR light where we wanted it.

Our schedule was fairly tight, so future testing would be nice to see how these IR illuminators behave when used in conjunction with flags or reflectors. There's a whole world of possibilities we just didn't have time to test. Next time for sure.


Example: PVS-14 + Canon 5DmkIII
Source: Roy Lin / Weapon Outfitters

None of these are really complaints, but rather areas where I see improvement over what is normally a very finicky proposition with other systems (like the PVS-14). And the results are rather spectacular. The OWL doesn't create the typical 'image in a donut' effect we normally associate with NODs hooked up to cameras (see: Patriot Games). Instead, the OWL fills the entire frame with a glorious green image. That gives us the option to add vignetting back later if we're trying to simulate a soldier's point of view through a NOD, but for the purposes of a product video it's certainly a requirement not to waste half your image on straight blackness.

At the end of the day, or rather night as the case may be, it's an interesting experience to eschew conventional optics and visible light in favor of systems designed for the military. The results were exceptional, and combined with the RED EPIC they allowed us to get some shots typically not seen on video. Our 36 hours with the OWL were short, but they provided valuable takeaways that we can come back to and improve on.

Cut of Your Jib by Isaac Marchionna

IMG_2942.JPG
IMG_2951.JPG

This past Sunday I was given the opportunity to work with two local filmmakers, Sean Brown and Tim Jankowski: Sean is a local cinematographer, and Tim is a jib owner/operator. I don't get enough opportunities to fly my camera on a jib, and Tim was looking for an excuse to get time with his new TALON head, a two-axis motion control head. This also gave me the chance to run my wireless HDMI kit and see how it would function ahead of a shoot in July that Tim and I will be working on together. Normally when I've done jib work I've played purely the role of AC, controlling focus, iris, and zoom. This, however, was an interesting chance to get some time controlling the Talon head, which uses two handwheels to control pan and tilt. And the experience... is, well... interesting.

"The best way to describe it...it's like rubbing your tummy and patting your head, only you're trying to do it 60 feet away from your body, and everything is reversed..."

The Talon head is interesting in that it allows for recording the operator's X and Y movements, which it can then play back in real time or over a longer period. Where this becomes extremely attractive is in motion control work for crowd replication, time lapses, and so on. Basically, if you have something with a Mitchell mount, this head will go on it. The mind boggles at the amazing things you can do, especially with a head of this weight capacity. Now, as for use... I'll say that jib operators get mad respect from me, as I either lacked the coordination, or the time, to fully acclimate to panning and tilting the camera using two separate controls. Simply put, the entire process of jib operation is a total concert between three people: the AC (myself), the grip (Tim), and the jib operator (Sean). If any one part is late, or lacking, the entire shot falls apart.

That said, we filmed over in Brooklyn Park, and besides a few raised eyebrows, everyone in the park was game to come over and see what we were doing. This provided us with some child actors who served as moving subjects for all of us to practice on.

 

Considering we were all coming to the Talon head as newbies, the result isn't half bad. There are a few issues for sure, but this was done about an hour after setup, and the majority of that time went to debugging the camera, remote start/stop, and the sensitivity of the Talon's hand controls. I was very pleased that the Nyrius ARIES wireless HDMI kit worked perfectly out of the box and looked as good as the SDI feed would have. Plus it was totally wireless, which is frankly dark magic to me. I can see this kit getting a lot of love on dollies, shoulder rigs, and of course jibs.

This won't be my last time on a jib, as I'll be pulling focus with Tim on a project in July that looks to be a lot of fun. This was also an excellent opportunity to debug some camera issues so that we weren't doing so on a client's time, as well as to dream up some awesome uses for what is a very awesome piece of motion control gear.

1006179_10151842882965312_606169011_n.jpg

Thoughts on Cinephotography by Isaac Marchionna

In 2008, we were all swept up in the wave of DSLRs that were powerful enough to start making videos. A few years later, we now have video cameras generating still frames good enough for print. Companies like RED define these new cameras as DSMCs, or Digital Still and Motion Cameras. I've now had the chance to work on a handful of projects using the EPIC primarily for stills, with the benefit of capturing 24-120fps video on the side. I've heard this described as 'cinephotography.' And while I think this is a peek into the future, we're not all flying around in hover-cars and silver jumpsuits... yet.

"I do however think it’s always important to look at the cutting edge technology available today – to see what we might all have in our hands tomorrow." - Vincent Laforet

The benefit of this process is pretty simple to explain, given the limitations of high-definition cameras, which can only record at 1920x1080. With systems such as the RED ONE/EPIC/SCARLET, we can now grab single frames that are on par with images generated by professional still cameras. You can certainly grab frames out of footage from your Canon 5D, but they're 8-bit, usually not that sharp, and almost assuredly degraded by the codecs involved. What we see with the EPIC is the ability to capture video and stills all in one go: assuming the image is in focus and properly exposed, you can treat each frame as a potential photograph.

In principle, this allows us to work smarter, not harder, because you're producing 14-19 megapixel images with a huge helpin' of video alongside them; they're one and the same. That said, I'm starting to fall into the camp that believes this only works if your primary mission is to shoot video, with the side benefit of getting amazing still frames out of it.
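As a rough illustration of where that 14-19 megapixel figure sits relative to a 1080p frame grab and a DSLR still, here's a quick comparison in Python. The RED numbers are nominal 5K/6K recording areas and the print target is an assumed 300 dpi; actual active areas vary with aspect ratio, so treat the output as ballpark.

```python
# Megapixel math behind the "14-19 megapixel" figure above, compared with a
# 1080p frame grab and a typical full-frame DSLR still. Resolutions are
# nominal; actual recording areas vary with aspect ratio.

formats = {
    "1080p frame grab": (1920, 1080),
    "RED 5K frame":     (5120, 2700),
    "RED 6K frame":     (6144, 3160),
    "Canon 5D Mk III":  (5760, 3840),
}

for name, (w, h) in formats.items():
    megapixels = w * h / 1e6
    print_width_in = w / 300  # printable width in inches at a 300 dpi target
    print(f"{name:18s} {megapixels:5.1f} MP  ~{print_width_in:.1f} in wide at 300 dpi")
```

A 1080p grab tops out around two megapixels and barely six inches wide at print resolution, which is why it only really works for reduced web deliverables, while a 5K or 6K frame lands in the same ballpark as a DSLR still.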

8347070331_cb112a33b8_b.jpg

Why do I say this? I don't believe these DSMC systems (the RED EPIC, and soon systems like the Blackmagic 4K and Sony F55) work nearly as well for photography as a dedicated photographic system, such as your Nikon/Canon DSLR. And they don't come close to competing with medium format digital systems... yet. The reason is workflow. As a photographer, you know when you have that moment, call it whatever you want: 'snap,' 'capture,' etc. You know in that instant, and a glance at the DSLR's LCD confirms, that you have what you're looking to achieve. With a system like the EPIC you're burning through gigabytes of data per minute, knowing you'll have to scrub through anywhere from 24 to 120 frames for every second of footage. This presents two challenges: time and storage.

Time is the biggest one. Generally on these projects we're not paid by the hour, so we have to be able to quickly identify, and begin working with, the assets we produced. With a still camera we're producing a few dozen to a hundred images per usable image. In this cinephotography style we're producing 100 to 1,000 times that much information, so the ability to sort through it becomes the biggest drain on time.

Alongside that flood of information comes the issue of storage. You're now generating half a terabyte of data on the low end for what could yield only a handful of production-ready images. Beyond sorting through the data, you and the client need to know going in that you're essentially starting with an iceberg and intending to chip it down to a few ice cubes.
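To make the time-and-storage tradeoff concrete, here's a hedged sketch of how the frame count and data footprint stack up over a shoot. The 90-minute runtime and 100 MB/s REDCODE data rate are illustrative assumptions (actual rates depend on resolution, frame rate, and compression), not figures from any particular job.

```python
# Rough order-of-magnitude math on the cinephotography workflow: how many
# frames pile up to scrub through, and how much storage they occupy.
# The runtime and data rate below are illustrative assumptions.

fps = 120                 # worst-case high-speed capture mentioned in the post
recorded_minutes = 90     # assumed total recorded runtime for the shoot
data_rate_mb_s = 100      # assumed REDCODE data rate in MB per second

total_frames = fps * 60 * recorded_minutes
storage_gb = data_rate_mb_s * 60 * recorded_minutes / 1024

print(f"Frames to sort through: {total_frames:,}")      # 648,000
print(f"Approximate footprint:  {storage_gb:,.0f} GB")  # ~527 GB

# A stills shoot, by contrast, might be a few hundred RAW frames at roughly
# 25 MB each -- well under 10 GB, and a tiny fraction of the frames to review.
```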

So where's the benefit, then? These systems pay off when you reverse the workflow. I don't believe systems like the EPIC can beat DSLRs when they're matched up purely as still cameras, with the goal of only producing still images; DSLRs have had decades to become mostly elegant executions of capturing still frames in time. However, if you come at cinephotography from the perspective of producing video first, knowing that you can extract quality stills after the fact, then it is an unbelievably powerful toolset. I've run out of fingers and toes to count the number of times a client has asked for still frames for websites, magazine articles, or other print publications, only to realize that 1080p won't cut it for anything but reduced web deliverables. So it's fantastic to be able to cut a video and then provide a client with high-quality stills on request without compromising anything in the process. Where it really becomes a quantifiable selling point to the client is the ability to go out and shoot a project knowing, during production, that you're also generating still captures at the level of quality of a DSLR. I think it's that 180 on the intention of how you're using a DSMC system that takes it from a headache to something incredibly valuable to the client. We reduce the time and cost of carrying two different camera systems, while generating a product equal to the quality of each respective system.

If making beautiful video is your goal, then the payoff of creating still imagery becomes part and parcel, and you end up with two amazing deliverables for the price of one. I believe that if you can keep that process, and that mindset, that's where you see a positive convergence in technology. And at that point the line separating those two technologies starts getting really blurry, in the best of senses.