Ben's Comp Newsletter: Issue 052
I hope you've had a productive start to the year so far.
This week's newsletter focuses on a handful of neat things that will streamline your workflow and help you create better comps!
Quick Tip: Decorate nodes that cause an oversized bbox.
This has been a feature since Nuke 11.3v1, but I had no idea it existed until recently! In Preferences > Node Graph, there is a setting that enables a warning on a node when its bounding box is larger than a certain preset threshold. When enabled, the erroneous node will display a thick red border with a black dotted line around it; any downstream nodes that carry the large bbox will only display the black dotted line.
If you haven't already, I recommend switching this feature on and paying attention to any warnings, to ensure your Nuke scripts remain speedy and efficient! Additionally, think about some knob defaults you can set on certain nodes to keep your bbox small (e.g. create a shortcut that creates a Merge node, sets its 'operation' to mask and its 'bbox' to A).
Lastly, with any node selected, you can run nukescripts.autocrop() in Nuke's Script Editor to automatically crop that node's bbox to its optimal size.
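The Merge shortcut suggested above could be wired up in your menu.py along these lines. This is only a sketch: the menu path, label and hotkey are placeholders of my own choosing, and it will of course only run inside Nuke.

```python
# For your menu.py -- a shortcut that creates a Merge node pre-set
# to keep the bbox small ('operation' = mask, 'bbox' = A).
# The menu label and the 'alt+m' hotkey are arbitrary placeholders.
import nuke

def masked_merge():
    m = nuke.createNode('Merge2')
    m['operation'].setValue('mask')  # only output where the mask input exists
    m['bbox'].setValue('A')          # clamp the bbox to the A input
    return m

nuke.menu('Nodes').addCommand(
    'Merge/Merge (mask, bbox A)', masked_merge, 'alt+m')
```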
Thanks to Conrad Olson for the tip!
P_Blood_Hit.
Finding great blood elements that travel with the same momentum as the characters in your shot, and are lit under lighting conditions similar to your shot's environment, can sometimes be a tricky task. Adam Kelway has come up with a neat particle-based gizmo that helps us with this conundrum.
P_Blood_Hit offers an abundance of control when needed, but a general setup is as easy as:
- Choosing the most accurate preset.
- Tweaking the amount, direction & speed of the particles.
- Roughly replicating the lighting in your shot with Nuke's 3D system.
- Rendering a pre-comp with plenty of motionblur samples!
Layer up a few variations of this gizmo with any blood elements that don't quite work on their own, and you'll quickly have a pretty nifty solution!
De-hazing in Nuke.
Last month, Mads Hagbarth Damsbo released a quick tutorial on a technique that mimics Lightroom's de-hazing feature using standard Nuke nodes. Simple technique, impressive results!
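I haven't reproduced Mads' node tree here, but the general idea behind most de-haze tools can be sketched outside Nuke with a simplified dark-channel approach. Everything below (function name, parameters, the per-pixel dark channel with no patch filtering) is my own assumption for illustration, not the technique from the tutorial:

```python
import numpy as np

def dehaze(img, omega=0.8, t_min=0.1):
    """Simplified dark-channel de-haze sketch (per-pixel dark channel,
    no patch filtering) -- an illustrative assumption, not Mads' setup."""
    dark = img.min(axis=2)                        # darkest channel per pixel
    A = img.reshape(-1, 3)[dark.argmax()].max()   # crude airlight estimate
    t = np.clip(1.0 - omega * dark / max(A, 1e-6), t_min, 1.0)
    # Invert the standard haze model I = J*t + A*(1 - t) to recover J:
    return (img - A) / t[..., None] + A
```

In node terms, the same idea maps to grades and merges driven by an estimate of the haze; see Mads' tutorial for how he builds it with standard Nuke nodes.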
Unrelated: if you haven't been following Mads' Point Render project, you really should check it out. There are some groundbreaking new features in the latest update!
A VFX Producer recently posed the question: "If DeepFakes are so good, why are the best studios in the world investing so much money to create photorealistic digi-doubles?"
It's a great question, with a straightforward answer. Put simply, DeepFakes are generated automatically by training an algorithm on a dense set of reference imagery. After training, all a user can do is hit "go" and hope for the best. With no extra control, what happens when a client wants to subtly change part of a character's performance? This has been the status quo for DeepFakes, until now...
As a personal project, Thiago Porto set out to explore what's possible in comp with the help of AI / Deep Learning models. What he came up with can only be described as DeepFakes on steroids. Using enhanced Deep Learning algorithms, Thiago has managed to add morph targets & relighting capabilities to DeepFaked faces, offering a ton of flexibility, and getting closer to changing my answer to the VFX Producer's original question.
These results are nothing short of astounding.
Did you find this newsletter informative?
Have you created, or do you know of any outstanding Gizmos, Python Scripts or Tutorials that you would like to share with the global Compositing community? Please reply to this email, and I will do my best to include it in a future issue of this newsletter.
Support on Patreon
Ben's Comp Newsletter: Issue 052 is sponsored by Keegen Douglas.
If you get value from reading Ben's Comp Newsletter every other week, please consider contributing via Patreon to help keep it running!
Thank you to the following supporters:
Shih Yi Peng
+ 2 others...