Ben's Comp Newsletter: Issue 014


I want to take this opportunity to express my gratitude to you, the reader of this newsletter. Since starting this little project in January 2018, the readership has grown every week, and has exceeded all my expectations! If that growth tells me anything, it's this: Compositors constantly crave more knowledge & seek opportunities for self-improvement!

My goal is to share knowledge, tools and techniques with the global compositing community, so we can all improve our skills & efficiency together. If you find this or any future issue of this newsletter useful, please consider sharing it with a friend or colleague.

Now, let's get on with some learning!

This article was sparked by a good friend and a junior colleague, who both approached me with the same question: "How do I become a better Compositor?" This is something I believe we should all ask ourselves from time to time, no matter our years of experience, and it's a question that isn't restricted to VFX but applies to all walks of life.

Though it leans to the philosophical side, I feel this article is relevant to everyone, and I hope my thoughts on the matter inspire you to reach for greater heights.

Click here to read the article.

This Python script changed the way I work. Rather than manually keeping track of where the first and last frames of my rotoshapes are and setting in and out points after the fact, this script automates the entire process! Huge props to Satheesh R for creating this.

Click here to download on Nukepedia, and click here to read more about how it works.
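To give a feel for what a script like this automates (this is not Satheesh's actual implementation, and the shape/keyframe data below is entirely hypothetical — in Nuke you'd read it from the Roto node's curves), the core logic boils down to finding the earliest and latest keyframes across all shapes and using those as your in and out points:

```python
# Hypothetical keyframe data: the frames on which each rotoshape has a key.
# In a real Nuke session this would be queried from the Roto node, not hard-coded.
shape_keyframes = {
    "shape_head": [1012, 1020, 1031],
    "shape_arm":  [1008, 1015],
    "shape_bg":   [1001, 1050],
}

def frame_range(shapes):
    """Return (first, last) frame across every shape's keyframes."""
    all_frames = [f for frames in shapes.values() for f in frames]
    return min(all_frames), max(all_frames)

first, last = frame_range(shape_keyframes)
print(first, last)  # 1001 1050
```

Those two values are what you'd then push into the node's frame-range knobs, rather than hunting for them by scrubbing the timeline.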

It's important for Compositors to understand how things work in the real world, and how they're captured through a camera's lens. The physics behind how a lens refracts light is a topic that fascinates me, so I've compiled a few of my favourite resources for you to learn from!

Jed Smith has a great lecture on how to create physically accurate depth of field in Nuke, which you can watch here. Jed also offers a gizmo on Nukepedia that puts this theory into practice.
(Note: if you use pgBokeh, it works in a similar way. Use whichever node gives you the nicest result!)
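For a taste of the physics involved, the blur you get at a given depth can be estimated with the standard thin-lens circle-of-confusion formula: aperture diameter, times lens magnification, times how far (proportionally) the subject sits from the focal plane. This is a minimal sketch of that textbook formula, not Jed's gizmo; all distances are assumed to be in the same units as the focal length (e.g. millimetres):

```python
def coc_diameter(focal_length, f_number, focus_dist, subject_dist):
    """Circle-of-confusion diameter for a thin-lens model.

    focal_length, focus_dist, subject_dist must share one unit (e.g. mm);
    the returned blur-circle diameter is in that same unit.
    """
    aperture = focal_length / f_number                      # entrance pupil diameter
    magnification = focal_length / (focus_dist - focal_length)
    defocus_ratio = abs(subject_dist - focus_dist) / subject_dist
    return aperture * magnification * defocus_ratio

# 50mm lens at f/2, focused at 2m: a subject at 4m gets a ~0.32mm blur circle
# on the sensor, while a subject at the focal plane gets none.
print(coc_diameter(50, 2.0, 2000, 4000))
print(coc_diameter(50, 2.0, 2000, 2000))  # 0.0
```

Dividing the resulting diameter by the sensor width and multiplying by your plate's resolution converts it into a per-pixel defocus size, which is essentially what a depth pass drives in a physically based defocus node.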

Secondly, this article by Tyler Britton focuses on overcoming the challenge of correctly defocusing an image with transparent objects using a standard zDepth pass and a matte. He presents the theory in a simple & easily digestible manner, and provides a gizmo (via a Python script) which you can run to achieve the results he shows in his video demonstration.

This is one of those technologies that needs to find its way into VFX pipelines ASAP!

It works by printing tracking markers on a surface in infra-red ink, making them invisible to the human eye (and a regular camera) but detectable by an infra-red-compatible camera. In this example, they're capturing IR data at 1000fps & projecting a deformed image back onto the surface in real-time with a high-speed projector.

Click here to watch the video, or click here to read more about the technology.


If you've created a gizmo or Python script to solve a common problem or speed up your workflow, please reply to this email and let me know about its existence! I'd love to spread the word, so we can all become better compositors together!

Click Here to view previous issues.