The 2023 GDC VFX Roundtable covered a range of lively topics, including workflows/tools, lighting, expectations, and AI
By Keith Guerrette on Jun 09, 2023
Introduction
The VFX Roundtables at GDC each year are something both truly special and slightly disappointing - they are the single largest gathering of real-time visual effects talent that I’ve ever seen, bringing with them an excitement, energy, and the shared overwhelm of an ever-changing and expanding industry. At the same time, there’s simply no way to capture the depth and variety of the conversations that I WISH could happen amongst all of us.
For GDC 2023, I had the absolute pleasure and amazing opportunity to co-host with my good friend Jason Keyser (VFX Apprentice). Below are our collective best efforts to list notes from the discussions.
The Official GDC Listing of the Visual Effects Roundtable:
The round table discussions are broken into three days, each with a slightly different focus: General, Artistic, Technical. Each day began with a quick introduction of the topic, a reminder of the value of our shared community and participation on realtimevfx.com, our discord, twitter, etc, and most importantly for the hour - a ground rule to try and keep each topic to about 5 minutes, and allow everyone a chance to contribute and be heard.
Midway through the session (once people loosen up a bit), we would stop and ask for hands raised on three questions (I loved this, so props to Jason or whoever he got the idea from) - Raise your hand if you’re a student or just getting started in your career in VFX. Raise your hand if you’re looking for work as an individual or as a business. Raise your hand if you’re looking for VFX Artist support for your project. Now make sure you all meet each other once we end the session today. Interestingly, it felt like the room was split roughly into thirds each day.
Day 1: General
General topics and revisiting ground broken in the VFX Summit earlier in the week.
Sadly, flight delays and an early call time combined to prevent me (Keith) from being able to make this day. Fortunately, Jason Keyser is an incredible host and Sandra Femenía kindly took notes. First-level bullets are the topics/questions asked to the group; sub-bullets are the answers.
Topic: Being extroverted while working from home
Stream on Twitch
Be involved with the online communities
AI in our Art?
A few people said they’re using AI Extensively
Others are using it or seeing it used as reference only, not a final product
VFX Jargon
Alpha clipping, etc. having different meanings/uses between UE and Unity
Our specialized jargon causes communication challenges - feedback gets muddled when specific keywords are misused or missing
Is there a VFX Dictionary somewhere?
We should create a working group to document/host something like this on Realtimevfx.com
Examples of variations in language between engines:
COD Engine uses the name “Runner” for particles that follow other VFX Data. Bungie calls them Parent-Child particles
Keith’s addition while aggregating these notes: a Shader in Unity = a Material in Unreal, while a Material in Unity = a Material Instance in Unreal
Graph Editors?
Don’t let artists use graphs because of optimization concerns
Let artists do stuff on graphs and then have a TA optimize it
It depends
Material Functions are powerful friends to artists
Shader graphs are crucial
Bungie provides total freedom for artists and supports them with good tooling and profiling
COD doesn’t have a shader editor because they focus on an incredibly organized production pipeline with reusability and realistic VFX.
Houdini in Pipeline
Frostbite Engine (as used on Battlefield) uses Houdini A LOT - mesh particles, pyro sims, data transfers, etc.
Houdini allows users to create whatever data they need.
Houdini is extremely important, our artists need to know it
We kinda started using it several years ago, but everyone was hesitant because artists were intimidated by the technicality of it
It’s so important to build seamless pipeline experiences between your engine and Houdini
It allows for procedural approaches, which can scale very nicely into production
Hiring Portfolios - what’s important to demonstrate?
Soft skills, like communication and critical thinking are so important
I want to see curiosity
Artistic vs. Technical - show us your artistry first, and your technical capabilities second
Also keep optimization/performance in mind
Clarity and narrative - we want something that’s clean on screen.
How much do you use scratchpad in Unreal's Niagara?
I use scratchpad in order to save time when I find myself repeating the same functionality/setup again and again
An example of a module I created accesses the g-buffer to erode particles based on screen color (a conceptual sketch of the idea follows below)
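Niagara’s Scratchpad modules are node graphs rather than code, so here is only a conceptual Python/NumPy sketch of the math that kind of module might implement - sample the scene color behind the particle and erode the sprite’s alpha where the scene is bright. The function name, the Rec. 709 luminance weights, and the erosion_strength parameter are all illustrative assumptions, not the speaker’s actual module.

```python
import numpy as np

def erode_by_scene_color(particle_alpha, scene_color, erosion_strength=1.0):
    """Conceptual sketch: fade particle alpha where the scene behind it is bright.

    particle_alpha : (H, W) array of the particle sprite's alpha, 0..1
    scene_color    : (H, W, 3) array of scene color sampled behind the sprite
    erosion_strength : how aggressively bright pixels eat into the alpha
    """
    # Approximate luminance of the scene behind each particle pixel (Rec. 709 weights).
    luminance = scene_color @ np.array([0.2126, 0.7152, 0.0722])

    # Brighter scene pixels erode the particle more; clamp to keep alpha valid.
    eroded = particle_alpha - luminance * erosion_strength
    return np.clip(eroded, 0.0, 1.0)

# Illustrative use: a soft circular sprite over a noisy stand-in scene buffer.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
sprite = np.clip(1.0 - np.hypot(yy - h / 2, xx - w / 2) / (h / 2), 0.0, 1.0)
scene = np.random.rand(h, w, 3)
result = erode_by_scene_color(sprite, scene, erosion_strength=0.8)
```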
Unit testing for VFX
Mostly test by hand. On Infinity War, they had unit testing with a box that ran through the whole map, producing a heat map of framerate, but most places don’t have an infrastructure like that (a minimal sketch of that kind of heatmap pass follows below)
Some setups record GPU cost per build so you can build comparison graphs over time, but the systems broke a lot and often weren’t usable.
An example was given of creating dynamic systems, such as custom LODs depending on the number of barrels exploding within a radius of each other.
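No one shared engine code for the run-through idea, but as a minimal sketch of the concept: sample frame time at positions along a path and bin the worst value per cell into a 2D heatmap, so VFX-heavy hot spots stand out. Everything here (names, map bounds, the synthetic path and spike) is an illustrative assumption.

```python
import numpy as np

def build_frametime_heatmap(samples, map_min, map_max, resolution=64):
    """Bin (x, y, frame_time_ms) samples from an automated run-through
    into a 2D heatmap of the worst frame time seen in each cell."""
    heatmap = np.zeros((resolution, resolution))
    extent = np.maximum(np.array(map_max, dtype=float) - np.array(map_min, dtype=float), 1e-6)
    for x, y, frame_ms in samples:
        # Map world position into heatmap cell indices.
        ix = int(np.clip((x - map_min[0]) / extent[0] * (resolution - 1), 0, resolution - 1))
        iy = int(np.clip((y - map_min[1]) / extent[1] * (resolution - 1), 0, resolution - 1))
        # Keep the worst (largest) frame time per cell so spikes stand out.
        heatmap[iy, ix] = max(heatmap[iy, ix], frame_ms)
    return heatmap

# Illustrative data: a run along a diagonal path, with a VFX-heavy hotspot mid-map.
samples = []
for t in np.linspace(0.0, 1.0, 500):
    x, y = t * 1000.0, t * 1000.0
    frame_ms = 8.0 + (20.0 if 0.4 < t < 0.5 else 0.0)  # spike near the hotspot
    samples.append((x, y, frame_ms))

heatmap = build_frametime_heatmap(samples, map_min=(0, 0), map_max=(1000, 1000))
```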
When in the pipeline do you usually start making VFX optimization and performance benchmarking?
I suggest doing it at the end so VFX artists can keep working, instead of spending three days optimizing things that might not even ship
Lightning round:
You can use scratchpad to animate electricity in such a way that it dynamically interacts with the world around it
Documenting your project's shape language in Notion, Milanote or Obsidian for your team to check from time to time could save you a lot of iteration/feedback time during production
You can use scratchpad to make smoke that rolls outwards
In the tech artist roundtable someone mentioned that they used Houdini for directional harmonics
For deformation systems, you can pack information into the UV channels with Houdini (see the sketch after this lightning round)
If you compile your shaders without having Unreal focused, the compilation will run faster since your system won't be rendering Unreal at the same time
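As a minimal sketch of the UV-packing idea, assuming a Houdini Python SOP (a VEX wrangle is the more common route): pack each point’s normalized height above the bounding-box floor into a secondary UV attribute that a vertex shader could later read to drive a bend or sway deformation. The attribute name uv2 and the height-based packing are arbitrary choices for illustration.

```python
# Runs inside a Houdini Python SOP; the hou module is provided by Houdini itself.
import hou

node = hou.pwd()
geo = node.geometry()

# Create a secondary UV point attribute to carry deformation data.
# The name "uv2" is arbitrary - use whatever your engine import expects.
if geo.findPointAttrib("uv2") is None:
    geo.addAttrib(hou.attribType.Point, "uv2", (0.0, 0.0, 0.0))

bbox = geo.boundingBox()
min_y = bbox.minvec()[1]
height = max(bbox.maxvec()[1] - min_y, 1e-6)

for point in geo.points():
    # Pack normalized height (0 at the base, 1 at the top) into uv2.x.
    # A vertex shader can read this to weight a bend or wind deformation.
    normalized_height = (point.position()[1] - min_y) / height
    point.setAttribValue("uv2", (normalized_height, 0.0, 0.0))
```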
Day 2: Artistic Lens
VFX through an Artistic Lens, focusing on attendee questions about the artistic aspects of VFX. This may include art education (as students or veterans), art direction, and more.
For this day, I was able to sit back, take notes, and participate as an audience member watching Jason work his magic.
For those that use simulations, what are the best practices that you’ve found over the years while art directing them?
It depends on what software you’re using. Art directors will bring requests to us, and we have to figure out how to translate and iterate
Non destructive pipelines are very important
Can you build in controls while you’re initially setting up, so you don’t put yourself into a hole
Consider showing a “grid” of options
Start very coarse/broad, with as low a resolution and as fast a simulation as possible - iterate all the way through implementation of the effect in game so you can understand what’s working, then progressively add detail (and simulation time) to your simulation.
Vector Field Painter - VR App, allows the user to stand inside the shape and paint vectors around them and see the space
What other cool tools have you guys found?
Substance Designer has become very rewarding and useful
SyncSketch.com - very useful for reviews
klash.studio - a SyncSketch competitor; someone in the room said his tool is worth looking at as well!
ToonSquid on iPad for frame-by-frame animation
Toon Boom Harmony
Krita
FlipaClip
Procreate
Tahoma2D/OpenToonz
Aseprite
ProtoPie
Goodnotes has a great shape tool for notes and diagrams on iPad
Obsidian for reference / video boards, collaboration
How do you establish a visual language and shared terms for art-directed VFX guidance - “make it more magical, less fantasy, more stylized, etc.”? What vocabulary do we use?
A lot has to do with Shape language and readability - minor paint overs are lifesavers
Having a style guide is absolutely crucial so you can reference it as a source of truth - set those rules ahead of time and update them when your development team grows and defines new things
Draw-overs are the most helpful way to translate from the spoken word to visual language, if it’s a still frame. Motion is harder
Motion requires people to also specify where in time the points of feedback are
Having a good team lead who works closely between the art director and the team to iterate and find what’s good
Documentation / notes are very helpful for allowing the entire team to reference things together - the audience suggested Notion, Milanote, and Obsidian
If your resources are spread thin, is it better to put more effort into the bigger effects, or to spread out and get coverage?
It depends on the project and needs
If you’re in school, and interested in VFX, you should make sure you focus on building a great presentation for your work, rather than spreading for coverage
It’s worth understanding why the effects are needed - sometimes you need the effects to make moments feel good, which can often be achieved without needing a ton of polish
Consider looking at the marketplace for assets to cover the easier and less important things so you can focus on the things that matter the most for your production
Make sure you put in the time to get the connections and gameplay logic in place - that’s often where a huge portion of work comes from, and good implementation allows you to work more efficiently.
How do you nail timing in your effect? What about simulated effects?
Consider the musical beat - make mouth noises and hear the timing
Timing is always contextual - Block in the effect so you can see it in context and understand what’s going to work before you add details, especially with simulations
With simulations it’s especially important to block in the timing before you add details, because it becomes so much harder to alter later
Two techniques for simulation flipbooks that allow you to retime:
Optical Flow / Motion Flow / UV Flow that blends between frames of your flipbook (see the sketch below)
Stable Diffusion is really good at adding frames to a flip book if you didn’t get enough frames and don’t want to go back to render
This year's VFX Summit focused quite heavily on this subject, so everyone that can access the vault should check them out!
Consider playing with the timing by looking at a frame-by-frame render of your effect - it’s so much easier to add or delete a frame to quickly experiment with timing
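The motion-vector blending mentioned above is normally done in the shader; this is just a hedged NumPy sketch of the core math - warp the two neighboring flipbook frames along their baked motion vectors and cross-fade, so the flipbook can be resampled at arbitrary playback times. The UV-space vector convention and sign directions are assumptions; they vary with how your tool bakes the vectors.

```python
import numpy as np

def sample_flipbook_with_motion_vectors(frames, motion_vectors, time):
    """Resample a flipbook at an arbitrary time in [0, 1) by warping the two
    nearest frames along their motion vectors and cross-fading between them.

    frames         : (N, H, W) array of grayscale flipbook frames
    motion_vectors : (N, H, W, 2) per-frame UV-space motion toward the next frame
    time           : normalized playback time in [0, 1)
    """
    n, h, w = frames.shape
    frame_pos = time * (n - 1)
    i = int(np.floor(frame_pos))
    blend = frame_pos - i
    j = min(i + 1, n - 1)

    # Pixel grid in UV space.
    vv, uu = np.mgrid[0:h, 0:w]
    uv = np.stack([uu / (w - 1), vv / (h - 1)], axis=-1)

    def warp(frame, offset_uv):
        # Nearest-neighbor lookup after offsetting UVs (a real shader would use bilinear).
        px = np.clip((offset_uv[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
        py = np.clip((offset_uv[..., 1] * (h - 1)).round().astype(int), 0, h - 1)
        return frame[py, px]

    # Warp frame i forward and frame j backward, proportionally to the blend factor.
    # Sign conventions differ between baking tools - flip these if the motion swims.
    warped_a = warp(frames[i], uv - motion_vectors[i] * blend)
    warped_b = warp(frames[j], uv + motion_vectors[j] * (1.0 - blend))

    # Cross-fade the two warped frames.
    return warped_a * (1.0 - blend) + warped_b * blend
```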
Quick intermission: Raise your hand if you’re a student or just getting started in your career in VFX. Raise your hand if you’re looking for work as an individual or as a business. Raise your hand if you’re looking for VFX artists. The room was again split roughly into thirds.
When do you know if your effect is done?
“Your effect is never done, it’s taken away from you”
When I start noodling on something randomly, I step away, and then I can come back later and see it with fresh eyes.
How do you define noodling?
When I’m not getting much impact from what I’m doing, it’s probably a good idea to step away, shelve it until tomorrow, etc.
It’s important to have a good lead or director who helps make sure there is a solid definition of success, so that accomplishing the goal feels like an achievement and everyone stays on track. Keep the energy going where it’ll have the most impact.
FX in games get shipped when they’re 80% done or when the time runs out
We often have so much work to do, so we have to get intimately familiar with the phrase “Good enough”
How do you overcome the feeling of not liking your work when everyone else does?
If everyone else is saying your work is great - believe them! You’re so close to it, you often can’t see it.
How do you encourage everyone or yourself to do the sketching phase?
Have a sketchbook just for elements/VFX, so that when you’re in that book the focus stays on VFX things.
If you can’t find the time to sketch every effect, taking the time to find a lot of reference can help fill some of that need
Pinterest, if you can navigate it, can be a great place to dive into inspiration
Sketching can be a very useful practice for exploring and defining the timing, locking in the feel of the effect before getting lost in the simulation
You can only save time while you still have it.
If you’re struggling with the “I should”, just force yourself to put something down in the editor and turn it into a “need” - that’s usually the kick you need to get going
Highly suggest looking at Elemental Magic books
How do you work with oversaturated environments where the lighting is blown out and everything is bright?
Consider backing your particle with a duplicate that has a “gray” value to give a neutral stage that you can control.
Make sure you test your effects in a variety of lighting environments, or at least in the final lighting context
What in your workflows eats the most time and causes the biggest headaches?
Designers - making sure your art direction is aligned with the design vision to ensure that you don’t have to throw away your work
The biggest time sink is the iteration loop required to actually see your effects in context - making sure design helps create good workflows while you’re working together is crucial
Testing for network games - this seems like such a pain point - any tips?
Everyone acknowledged that this is a particularly fragile scenario that often causes very bad slowdowns
Recording video captures to watch your effect multiple times
RenderDoc is a very useful tool for capturing entire playthroughs so that you can find issues in bulk
Ask your developer team to help support
Can the engineers help set up test scenes with bots to reproduce your scenarios for a very fast iteration process?
Test gyms for the win!
Lightning Round -
Object-based gradients by subtracting the local position from the world position and using the result as a UV (see the sketch after this list)
There’s a YouTube tutorial for how to make a holographic display for anyone “Unreal Niagara 3d Hologram Effect”
Exploring cheap hacks to create 3d Noises in shaders
Use a very bright ugly neon green on assets that need cleanup later
Relaxed Cone Mapping in Niagara talk - “Approaching Technical Art Techniques Unconventionally Using Unreal Engine's Niagara"
Always use Eye Adaptation to take exposure into account
W/A/S/D and Q shortcuts in Unreal’s Material graph align your nodes:
Align top: Shift + W
Align middle: Alt + Shift + W
Align bottom: Shift + S
Align left: Shift + A
Align center: Alt + Shift + S
Align right: Shift + D
Straighten Connection(s): Q
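For the object-based-gradient tip earlier in the list, one common reading is: subtract the object’s pivot position from each world-space position to get an object-relative coordinate, then remap one axis into a 0-1 gradient you can use like a UV. This NumPy sketch assumes that reading; the axis choice and normalization are illustrative.

```python
import numpy as np

def object_space_gradient(world_positions, object_pivot_ws, object_height, axis=2):
    """Sketch of an object-based gradient: subtract the object's pivot from each
    world-space position, then remap one axis into a 0-1 gradient usable as a UV.

    world_positions : (N, 3) world-space positions (e.g. per vertex or per pixel)
    object_pivot_ws : (3,) world-space position of the object's pivot
    object_height   : extent of the object along `axis`, used to normalize
    axis            : which axis the gradient runs along (2 = up here, an assumption)
    """
    relative = world_positions - np.asarray(object_pivot_ws, dtype=float)
    gradient = relative[:, axis] / max(object_height, 1e-6)
    return np.clip(gradient, 0.0, 1.0)

# Illustrative use: a 2-unit-tall object standing at (10, 5, 0).
verts = np.array([[10.0, 5.0, 0.0], [10.0, 5.0, 1.0], [10.0, 5.0, 2.0]])
uv_v = object_space_gradient(verts, object_pivot_ws=(10.0, 5.0, 0.0), object_height=2.0)
# -> [0.0, 0.5, 1.0], a bottom-to-top gradient independent of where the object sits.
```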
Day 3: Technical Lens
VFX production viewed through a Technical Lens, guiding the conversation through topics such as graph based effect authoring, effect lighting and optimization, new mesh based techniques, VR/AR/XR, and more.
Day 3 was my day to take point, and ultimately handle the immense responsibility of tossing and catching the awesome GDC microphone in front of a crowd.
What new tools are people using in their production pipelines for real-time VFX?
Blender
Embergen
FluidNinja
Maya is playing with AI compression of VDB Sequences
Is anyone exploring fully 3d/volumetric approaches in real-time yet?
Houdini and Embergen have fantastic workflows for VDB exports, but using them in the engine is still complicated - several plugins / teams are very close to having great solutions, so it’s near!
Real-time volumetrics in VR are still challenging/impossible, so there’s still a lot of faking
One team is tracking and offsetting the depth from each eye intelligently in the shader to create the illusion of depth in VR space
Does anyone else have any insights into “rear-projection” techniques like this or other techniques for VR?
Only a few people in the room are currently working on VR projects
How do people manage expectations when directors, managers, etc see offline-rendered content that can’t work at scale in a real-time production? (For example - seeing Unreal’s R&D posts about real-time fluids and directors ask for it in their game)
We have to be honest and make them understand that it’s just still not possible.
One person’s creative director asked to be shown how to make VFX so he could know what is and isn’t possible, then that became a part of their culture - that future leaders all had to understand enough to have context about their asks.
Lighting and environments are looking amazing in game engines, but VFX are not quite there yet - that’s causing harder conversations when level-setting expectations
Need to push back and manage expectations about the limits we still have with VFX
Gen 5 is still not able to do everything - most of the budget improvements went to lighting
Having a graphics engineer around can help with these conversations, and find ways to help with the problems!
We may have ideas as VFX artists, but the engineer can provide a lot more context about the limitations and reasons.
Are people using out of the box engines, custom engines, or engine extensions?
Game architecture of Unity is amazing as-is
Graphics advancements are best done custom
What we see in the demos isn’t what you get right away
It’s so easy to waste time on all the cool things, or trying to keep up with things that don’t matter - you need to ask what is critical to your game specifically and focus your energy there.
It depends on the team size - some places can’t spare the resources to create custom solutions to all of their problems.
Custom tech also has an upkeep cost, especially in Unity and Unreal - you’ll have to spend time maintaining it every time you update to a new engine version
A graphics engineer said that in their development with an in-house engine, they like to prototype in Unity first to understand what is needed, then implement the final result in their own engine.
They focus on making the tooling as efficient as possible, then the artists come in at the end to finalize.
When you see a fancy new thing as a company, how do you decide whether you’ll pursue it?
If it’s an unknown, then take the low-hanging fruit to prove an idea first
Are there performance issues? Visual issues?
If it’s 3rd party software, asking how usable it is, how supported it is, etc
Focus on the problem you’re trying to solve, not the tool
One company did monthly VFX challenges to get their art teams exploring and using new tools and techniques in a safe space, then they could make educated decisions for their production
Could be combined with other disciplines as well to make micro game jams
Tools can be a trap
Good tools don’t replace good art
Networking is a great way to discover what works for others and get new ideas
How do you deal with the constant updating of software and the visible frustrations, insecurities, and exhaustion of teams with new software?
[At this point, I interrupted and asked for a show of hands - ‘We don’t talk about this enough, but since we’re all here in the room, I think it’s worth calling out: how many of you feel uncomfortable as artists on a regular basis?’ (or something like that). Everyone raised their hands.]
An instructor at a school for neurodivergent students mentioned that he has trouble with students who prefer stability
He’s had to really lean into the idea that it’s more important to get the project done than to use the “correct” tool
Film industry has more diverse coverage of specialists within their VFX craft
It’s healthy to let people specialize
They can learn from the other VFX artists next to them
Lots of people want to discover new things so they can do their job better
It’s just easy to lose focus on the part that matters - art over tools
What type of Profiling tools do people use?
Pix
Zone Console
Renderdoc
Niagara has great internal profiling tools
It does dumps, timing graphs, spreadsheets, etc.
Common things to watch out for / consider when you find frame rate hitches caused by VFX:
Overdraw
Too many live physics simulations
Too many VATs in memory at once (large textures!)
Track down the likely suspects first and toggle them off to verify it’s them.
Build best practices along the way, so optimization doesn’t become such a big chore at the end.
Poll - How many people are authoring in Unity, Unreal, or Proprietary engines?
Pretty even spread throughout the audience
How are we solving for lighting on our particles in physically based render pipelines?
Set up a series of rooms that cover the whole EV and lighting spectrum, and make it easy to jump between them so you can see your effect in the full spread of lighting environments.
Name the particle with the use case intention (low/med/high) to control bloom
Work with lighting team to figure out what needs to be adjusted between each
Setting particular ranges of exposures for VFX vs. other aspects and authoring to fit within those ranges
It’s very challenging to retrofit these systems
Don’t forget to also consider color management - ACES is an industry standard for film and it’s starting to creep into games - someone mentioned using an ACES pipeline in Unreal with great results because there is still data in the highs and lows.
Someone suggested calibrating your monitors regularly
What issues are people having with Lumen in Unreal?
There’s no way to control the diffuse directional lighting intensity on your particles, so you’re stuck
Different light types create different values of shadows - watch out for this
Have any of you encountered positive use cases of AI for VFX? (this conversation immediately became philosophical first)
For the past several years, large companies have been betting on previously established IP to ensure profits, so smaller companies have had a difficult time producing enough content to compete - AI might be able to help with this
Ethics and jobs are big concerns for everyone
The hope is that people will be empowered to iterate faster and focus on better craft and end results
Indie devs will be able to do much bigger things than they could before
Creativity in how you use the tools will continue to be critical, regardless of the future
Closing Thoughts
As we wrapped up the final day of discussions, I was struck by a few conflicting thoughts. For our industry, I noticed a general theme across the entire conference that was focusing on tech artist solutions rather than engineering solutions. Before anyone throws their coffee at me, let me explain - when I began attending GDC, many of the most exciting innovations on display were paired with extremely programmer-centric presentations and discussions. This year, everything cool and exciting was paired with either 1) instructions on how to procure/install a 3rd party app that will ‘make all your dreams come true’™ or 2) guidance on how you, as a slightly technical artist, can create this solution yourself. I felt that this GDC was filled with artist empowerment.
The conflict in this thought, however, came in the realization that we are well on our way to an industry where sub-specialization as artists needs to be embraced. We’re very likely to start seeing a team dynamic that includes an artist who excels in ‘pre-engine’ fluid simulation pipelines, an artist with Houdini tool-building expertise, an artist who knows how to customize FluidNinja, another who can write Python integrations in Unreal, and a fifth who builds real-time shaders and blueprints. Each equally important, and exciting, but damn I couldn’t help but feel old and miss the ‘wild west’ days of trying to do it all without killing myself.
Now, for the Roundtable itself, I had another pair of conflicting thoughts: On one hand, this was an incredible experience - so many talented, diverse artists and developers in one room joining for a shared conversation around our passions, and it was exciting to feel! On the other hand, I wanted more depth, or more useful takeaways, out of these conversations. It’s entirely likely that this is just a limitation of the roundtable format - there’s probably a valid argument that the roundtable is supposed to be paired with, or spawn, further and deeper conversations, but on its own it has to stay broad.
Ultimately though, GDC and our special VFX summit/round table events left me feeling so proud of the incredible community that shows up each year, or each day on our forum, discord, facebook, etc to share, inspire, and mentor. Thanks everyone for participating in these conversations with us.