With this software I created ocean meshes for some multiplayer maps in Gears of War 4, blood textures for Red Dead Redemption 2, oil splashes, and so on. My most-used software is 3ds Max, mostly for creating low-poly meshes, UV mapping, rigging, and the like. I also use Adobe Photoshop to tweak rendered images or to create textures from scratch to apply to the particles. Most of my experience as a VFX artist so far has been on the gameplay side of effects.
That said, most of the time I was given an animation I had to stick to, adding an extra layer of effects on top: blood, magic, impacts, explosions, and so on. When working with animations, VFX are usually attached to bones, helpers, or other objects in the scene. This is a back-and-forth workflow between the animation and VFX teams, where the latter can ask for changes or additions to the sequence: adding extra keyframes to make a sword trail smoother, rotating a helper so its rotation can drive the direction of a blood splatter, or changing a camera cut to avoid compromising the point of view on a mesh particle, for example.
There are also engine features that can help with the animation or the dynamics of the effects. For example, when I was working at Mercury Steam we were using a proprietary engine and I sat very close to the core technology team, so I was lucky enough to be allowed to request new engine features.
I benefited a lot from it, as I could ask for new ways to link effects to a bone: for example, linking an effect to a bone's position but not its rotation, or taking both the bone's position and rotation but always using the Z height of the ground, which was very handy for linear fire attacks in Castlevania. A rough sketch of these link modes follows.
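Since the engine was proprietary, every type and name below is hypothetical; this is only a compilable sketch of how the three linking behaviours could differ, not the studio's actual code.

```cpp
// Hypothetical bone-link modes for an effect emitter (not real engine code).
struct Vec3  { float x, y, z; };
struct Quat  { float x, y, z, w; };          // unit quaternion
struct Transform { Vec3 position; Quat rotation; };

enum class BoneLinkMode {
    PositionOnly,            // follow the bone's position, keep the emitter's own rotation
    PositionAndRotation,     // follow the bone fully
    PositionRotationGroundZ  // follow the bone, but pin the height to the ground
};

Transform ComputeEmitterTransform(const Transform& bone,
                                  const Transform& emitterLocal,
                                  BoneLinkMode mode,
                                  float groundHeight)
{
    Transform out = emitterLocal;
    out.position = bone.position;

    switch (mode) {
    case BoneLinkMode::PositionOnly:
        break;                               // rotation stays as authored on the emitter
    case BoneLinkMode::PositionAndRotation:
        out.rotation = bone.rotation;
        break;
    case BoneLinkMode::PositionRotationGroundZ:
        out.rotation = bone.rotation;
        out.position.z = groundHeight;       // e.g. flames that always hug the floor
        break;
    }
    return out;
}
```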
I remember it was especially satisfying when one of my requests was implemented even though the programmer wasn't convinced that what I was asking for would work. I asked for a way to set the initial velocity of the particles as a function of the emitter's lifetime. Using that with a trail, I was able to get a blood spurt that never disconnected from the point of impact.
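A rough sketch of the idea, with made-up numbers and names: the initial speed handed to each newly spawned trail particle is sampled from the emitter's normalized age, so the first particles fly fast and the last ones barely move, and the ribbon joining them never detaches from the impact point.

```cpp
// Sketch: initial particle velocity driven by the emitter's normalized age.
// All names and values are hypothetical; real engines typically expose this
// as a curve on the spawn/velocity module.
#include <cstdio>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Full speed when the emitter is born, zero when it dies.
float InitialSpeedOverEmitterLife(float emitterAge01) {
    return emitterAge01 >= 1.0f ? 0.0f : 1.0f - emitterAge01;
}

Particle SpawnTrailParticle(float emitterAge01, float ox, float oy, float oz,
                            float dx, float dy, float dz) {
    float speed = 4.0f * InitialSpeedOverEmitterLife(emitterAge01);  // arbitrary scale
    return { ox, oy, oz, dx * speed, dy * speed, dz * speed };
}

int main() {
    // Every particle spawns at the impact point; later particles are slower,
    // so the trail connecting them stays attached to the wound.
    std::vector<Particle> trail;
    for (int i = 0; i < 10; ++i)
        trail.push_back(SpawnTrailParticle(i / 9.0f, 0.f, 0.f, 1.5f, 0.2f, 0.9f, 0.1f));
    for (const Particle& p : trail)
        std::printf("initial velocity = (%.2f, %.2f, %.2f)\n", p.vx, p.vy, p.vz);
}
```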
This extra layer of meticulousness added the feeling of viscosity I was after, and I was very proud of it. One of the big tasks of a VFX artist when it comes to implementing assets in game is optimisation. The word we say and hear the most is overdraw. Game engines usually render the final image in passes (albedo, roughness, metalness...), and every time there is a particle on screen it impacts the rendering pipeline, depending on how it is rendered.
For example, if we have two translucent particles on screen that overlap, the rendering cost of the overlapped pixels is doubled. There are good threads online about overdraw if you want to dig deeper. Most of the time our elements are not the most complex ones: mesh particles have low triangle counts, shaders are usually not very complex, particles are lit much more cheaply than characters or environments, and sometimes particles are not lit at all.
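As a back-of-the-envelope illustration of why overlapping translucency adds up (the numbers here are made up):

```cpp
// Toy overdraw estimate: with alpha blending every covered layer is shaded,
// so fill cost scales with the sum of the sprites' screen coverage.
#include <cstdio>

int main() {
    const double screenPixels = 1920.0 * 1080.0;
    const double coverage[]   = { 0.40, 0.40 };   // two sprites, each covering 40% of the screen

    double shadedPixels = 0.0;
    for (double c : coverage)
        shadedPixels += c * screenPixels;         // overlapping pixels are paid for twice

    std::printf("shaded pixels: %.0f (%.2fx the screen)\n",
                shadedPixels, shadedPixels / screenPixels);
}
```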
Even with that in mind, particles can get very expensive at runtime because of the way they are rendered. Roughly speaking, there are three major ways of rendering particles in Unreal, called Blend Modes.
Opaque: this is the cheapest method, because particles overlaying others occlude them, so the engine can discard the hidden pixels more easily.
Alpha masked: this is also called alpha test. Transparency values are either 0 or 1, so pixels are either fully visible or fully invisible, with no semi-transparency at all. It is the cheapest way of achieving transparency, at least in console games: the opaque pixels of the sprite occlude the pixels underneath, just like in the opaque method above.
Translucency: this is also called alpha blend, and it is the most expensive way to render a particle.
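A simplified CPU-side sketch of the difference between the two transparency paths (nothing here is engine code; it only models the per-pixel decision):

```cpp
// Alpha test vs. alpha blend at the pixel level (simplified model).
#include <cstdio>

struct Color { float r, g, b; };

// Alpha masked / alpha test: a binary keep-or-discard decision. Kept pixels are
// fully opaque, can write depth, and occlude whatever sits behind them.
bool AlphaTest(float alpha, float threshold = 0.5f) {
    return alpha >= threshold;
}

// Translucency / alpha blend: every covered pixel is shaded and blended with the
// framebuffer, which is why stacks of translucent sprites get expensive.
Color AlphaBlend(Color src, float srcAlpha, Color dst) {
    return { src.r * srcAlpha + dst.r * (1.0f - srcAlpha),
             src.g * srcAlpha + dst.g * (1.0f - srcAlpha),
             src.b * srcAlpha + dst.b * (1.0f - srcAlpha) };
}

int main() {
    std::printf("alpha 0.3 survives the mask? %s\n", AlphaTest(0.3f) ? "yes" : "no");
    Color out = AlphaBlend({1.f, 0.f, 0.f}, 0.3f, {0.f, 0.f, 1.f});
    std::printf("blended colour = (%.2f, %.2f, %.2f)\n", out.r, out.g, out.b);
}
```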
On top of the blend mode there is the shading mode.
Unlit: particles are not lit at all; all the lighting has to be baked into the texture or faked in the shader, which is not totally accurate.
Lit: particles receive lighting similar to the environment and props. Enabling the lit shading model on translucent particles adds an extra layer of complexity to the rendering of every particle, since it also lets semi-transparent particles work with Z-depth information.
This is expensive per se, but nowadays graphics cards can handle it if it is used carefully. Also, use the texture space wisely: when possible, avoid too many transparent pixels in the texture by getting the opaque pixels as close as possible to the borders. There are dedicated VFX optimisation guides that cover this in more depth. When we were working on the Gears of War 4 multiplayer maps we had to be especially careful about the performance impact of our work. Our goal was 1080p at 60 fps, so we had to be clever to achieve the expected quality while always keeping the gameplay smooth.
There was a level with a grass field, and after the profiling phase the technical art team pointed to the grass as the most expensive element on the map. After some discussion between the environment art, technical art, and VFX art departments, we decided to try a technique: I was put in charge of making the grass sprites so that the further they were from the camera, the fewer transparent pixels they had.
I created a diffuse texture covered with grass blades, but the opacity map was authored so that groups of blades had different grey values. The shader then increased that grey value the further away the camera was, so that more blades became visible with distance. As the material was set to alpha masked, those blades only showed up once their alpha reached 1; a rough sketch of the idea follows.
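One way to express that logic, with made-up fade distances (the real material had its own parameters):

```cpp
// Sketch of the distance-driven grass opacity trick (illustrative values only).
#include <algorithm>
#include <cstdio>

// The opacity map stores the always-visible blades at 1.0 and each extra group
// of blades at a lower grey value. A boost that grows with camera distance is
// added before the alpha-mask clip at 1, so more groups survive the clip the
// farther the card is; distant cards become nearly solid and occlude the grass
// cards behind them, killing overdraw at long range.
bool BladeGroupVisible(float storedGrey, float cameraDistance,
                       float fadeStart = 10.0f, float fadeEnd = 60.0f)
{
    float boost = std::clamp((cameraDistance - fadeStart) / (fadeEnd - fadeStart),
                             0.0f, 1.0f);
    return storedGrey + boost >= 1.0f;      // clip threshold of 1, as in the material
}

int main() {
    std::printf("grey 0.7 visible at 5 m?  %s\n", BladeGroupVisible(0.7f, 5.0f)  ? "yes" : "no");
    std::printf("grey 0.7 visible at 40 m? %s\n", BladeGroupVisible(0.7f, 40.0f) ? "yes" : "no");
}
```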
This way we saved tons of overdraw at long distances. Another example of optimisation, and of tricking the player, is deleting particles when they get too close to the camera.
We did this on Gears 4 too with smoke particles, light shafts, and so on: in the shader, particles became gradually transparent once they were within a certain distance from the camera (set in the shader), and when they were totally transparent we shrank their vertices; a sketch of this trick closes the section. There are also breakdowns out there that can easily be followed by anyone interested in the field who has a basic knowledge of a 3D package, and the official Unreal Engine 4 website is a good place to learn.
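As promised above, here is a rough sketch of that fade-and-collapse, with assumed distances and names; the real version lived in the material's vertex and pixel logic rather than in C++.

```cpp
// Sketch of the near-camera fade-and-collapse trick (hypothetical values).
#include <algorithm>
#include <cstdio>

struct VertexOut {
    float x, y, z;   // world-space position of the sprite vertex
    float opacity;   // fed to the translucent material
};

// Fade the particle out as it approaches the camera; once it is fully invisible,
// collapse the vertex onto the particle centre so the sprite has zero screen area
// and stops generating pixel work altogether.
VertexOut FadeAndCollapse(float vx, float vy, float vz,       // sprite vertex
                          float cx, float cy, float cz,       // particle centre
                          float distanceToCamera,
                          float fadeEnd = 0.5f, float fadeStart = 3.0f)
{
    float opacity = std::clamp((distanceToCamera - fadeEnd) / (fadeStart - fadeEnd),
                               0.0f, 1.0f);                   // 0 near the camera, 1 far away
    VertexOut out { vx, vy, vz, opacity };
    if (opacity <= 0.0f) { out.x = cx; out.y = cy; out.z = cz; }  // degenerate quad
    return out;
}

int main() {
    VertexOut nearVtx = FadeAndCollapse(1, 0, 2, 0, 0, 2, 0.3f);
    VertexOut farVtx  = FadeAndCollapse(1, 0, 2, 0, 0, 2, 5.0f);
    std::printf("near: opacity %.2f, x %.1f | far: opacity %.2f, x %.1f\n",
                nearVtx.opacity, nearVtx.x, farVtx.opacity, farVtx.x);
}
```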