Monday, Jan. 16th:
Our buildings were closed for Martin Luther King Jr. Day, so I stayed home and worked on making the light in my scene match the shadow in my reference better than it did before.
Going back to my lighting, I noticed more and more details about my shadow. The shadow cast by the gray ball in my scene was pretty sharp, which threw me off a little because all the other shadows in the scene were more faded than the original shadow in the reference. So I took my sun spotlight in Maya and started adjusting its position to match my reference photo. I also went into the spotlight settings and adjusted the roundness of the cast light to get a sharper shadow than before. Eventually I found a sweet spot, and the pixel falloff in my render started to match the reference. This process took some time, but I am glad I did it, because it prepares me for the next step in the project.
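For anyone curious, here is a rough sketch of those spotlight tweaks as a script, assuming a Maya spot light with Arnold (MtoA) attributes and a shape node named "sunLightShape"; the node names and values are placeholders, not the exact ones from my scene.

```python
# Hypothetical example: nudging the key spotlight to sharpen the ball's shadow.
import maya.cmds as cmds

# Re-aim the light toward the sun angle seen in the reference photo.
cmds.setAttr('sunLight.rotateX', -55)
cmds.setAttr('sunLight.rotateY', 130)

# A smaller Arnold light radius gives a crisper shadow edge,
# and a narrower penumbra keeps the cone edge from washing it out.
cmds.setAttr('sunLightShape.aiRadius', 0.0)
cmds.setAttr('sunLightShape.penumbraAngle', 2.0)
cmds.setAttr('sunLightShape.aiRoundness', 1.0)  # the "roundness" setting mentioned above
```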
Wednesday, Jan. 18th:
On the 18th, back from the Martin Luther King Jr. Day break, I took further steps in moving my ball through the scene and using the render layers I have been building since the start of the project.
I got to a point in the project where I started animating my gray ball through the scene to see how it moves in my lighting. This ball animation will also be useful later, when I add my environmental shadows. When the animation was done, I moved ahead and brought the rendered layers from this Maya project into Nuke to composite them all together.
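As a small sketch of how that kind of ball animation can be keyed with Python, assuming the sphere's transform is named "grayBall" (the name and frame range are hypothetical):

```python
# Hypothetical example: keying the gray ball's travel across the scene.
import maya.cmds as cmds

cmds.setKeyframe('grayBall', attribute='translateX', time=1, value=-10)
cmds.setKeyframe('grayBall', attribute='translateX', time=96, value=10)

# Linear tangents so the ball moves at a constant speed for the lighting test.
cmds.keyTangent('grayBall', attribute='translateX', time=(1, 96),
                inTangentType='linear', outTangentType='linear')
```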
Once in Nuke, I made some changes to the project settings under the Color tab. There, I switched the color management from Nuke to OCIO and then set the OCIO config to aces_1.1. This color management keeps the colors consistent between what I see in Maya and what I see in Nuke. After that, I started loading all my rendered layers into separate parts of the Nuke tree.
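The same project-settings change can be scripted; this is just a sketch, assuming a recent Nuke version (root knob names can vary between releases), and the file paths are placeholders.

```python
# Hypothetical sketch: switching the project to OCIO / ACES and reading in the layers.
import nuke

root = nuke.root()
root['colorManagement'].setValue('OCIO')   # Color tab: Nuke -> OCIO
root['OCIO_config'].setValue('aces_1.1')   # OCIO config: aces_1.1

# One Read per rendered layer, each feeding its own branch of the tree.
ball_read   = nuke.nodes.Read(file='renders/ball/ball.####.exr')
shadow_read = nuke.nodes.Read(file='renders/shadow/shadow.####.exr')
occ_read    = nuke.nodes.Read(file='renders/occlusion/occlusion.####.exr')
```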
The ball render is going to need a lot of work, because I need to separate the diffuse and specular, direct and indirect AOVs. I do this with Shuffle nodes, declaring which layer I want shuffled into the RGB channels. After each Shuffle node is connected back to the ball render, I use Merge nodes to combine them all. Three Merge nodes set to plus connect the four Shuffle nodes: one merging diffuse direct and indirect, one merging specular direct and indirect, and one more merging those two results together. The result gives a faded look to the alpha, which was an issue. This was resolved with a Copy node that copies the alpha from the ball render onto the final plus Merge so the alpha is full again. After putting this together, I made a separate node tree for all the layers that sit behind the ball layer: the shadow layer, the occlusion layer, and the ground occlusion layer.
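Here is a sketch of that rebuild in Python, assuming Arnold's default AOV layer names (diffuse_direct, diffuse_indirect, specular_direct, specular_indirect) and the classic Shuffle node; the layer and node names in an actual render may differ.

```python
# Hypothetical sketch: splitting the ball render into its AOVs and summing them back.
import nuke

ball_read = nuke.toNode('Read1')  # the ball render

def shuffle_layer(layer):
    # 'in' is a Python keyword, so it has to be passed through a dict.
    return nuke.nodes.Shuffle(inputs=[ball_read], label=layer, **{'in': layer})

diff_dir = shuffle_layer('diffuse_direct')
diff_ind = shuffle_layer('diffuse_indirect')
spec_dir = shuffle_layer('specular_direct')
spec_ind = shuffle_layer('specular_indirect')

# Three plus merges: diffuse pair, specular pair, then both results together.
diff_sum = nuke.nodes.Merge2(inputs=[diff_ind, diff_dir], operation='plus')
spec_sum = nuke.nodes.Merge2(inputs=[spec_ind, spec_dir], operation='plus')
beauty   = nuke.nodes.Merge2(inputs=[diff_sum, spec_sum], operation='plus')

# The plus merges wash out the alpha, so copy the original alpha back on top.
alpha_fix = nuke.nodes.Copy(inputs=[beauty, ball_read],
                            from0='rgba.alpha', to0='rgba.alpha')
```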
Friday, Jan. 20th:
On Friday, I started bringing more integration elements into my scene that take the sense of realism of my object to another level, beginning with how an environment shadow is made.
It turns out an environment shadow needs some perspective trickery to achieve this faux look. One of the first steps is to place a camera at the position of the spotlight in your scene. You do this by looking through your selected light, going to the view tab in your viewer, and selecting the option to "Create camera through view". This creates a camera and another (unnecessary) light in your scene. I deleted the extra light, looked through the new spotlight camera, and renamed it "KeyLightCamera". From this perspective, I created another render layer in Maya called "EnvShadowPrep" to produce the image the environment shadow paint will be made from.

Once the image is rendered from this perspective, I open Photoshop to create the environment shadow paint. Easy as it is, you just select the shadows in Photoshop with the magic wand tool, fill them with a red value, then invert the selection and fill that with black. Once the paint layer is made, you bring it back into Maya and use this projection paint to match how the shadows look in your regular scene. I had to pin the projection to the resolution gate because it didn't fit the scene entirely. Once the floor has this surface shader applied, you apply the same surface shader to the ball as well. This creates a warping effect around the ball that makes it look like it is picking up the shadows in the scene.

When this is done and all the layers are re-rendered, we bring everything back into Nuke. I read in the new environment shadow paint layer; this Read is separate from the foreground and background trees. From this Read, I use a Shuffle node to move the red values into the alpha. Now that there is a cut-out alpha, we can use it as a mask for a ColorCorrect under both the diffuse direct and specular direct Shuffle nodes. Setting each ColorCorrect to use the newly made alpha as its mask, and setting each node's gain to zero, creates the shadow in the scene (a rough sketch of this setup follows below).

Further adjustments will be needed to achieve a better integrated look, but this is a big step into the next part of the project. It just needs some more tweaking in Maya with the lights, Photoshop for painting more of the projection, and Nuke to layer it all back together for the final result.
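This is a rough sketch of the Nuke side of the environment shadow, assuming the red/black paint render is read in as a node called "EnvShadowRead" and that the diffuse direct and specular direct Shuffles from the ball tree are the ones being masked; node names, the mask input index, and knob names are assumptions and may differ by Nuke version.

```python
# Hypothetical sketch: turning the red paint into a matte and using it to darken the scene.
import nuke

env_read = nuke.toNode('EnvShadowRead')           # painted red/black projection render
diff_dir = nuke.toNode('Shuffle_diffuse_direct')  # from the ball AOV rebuild
spec_dir = nuke.toNode('Shuffle_specular_direct')

# Push the red channel into the alpha so the paint becomes a cut-out matte.
shadow_matte = nuke.nodes.Shuffle(inputs=[env_read], alpha='red')

def masked_shadow(source):
    # Gain of zero inside the matte pulls that region down to black.
    cc = nuke.nodes.ColorCorrect(inputs=[source])
    cc['gain'].setValue(0)
    cc.setInput(1, shadow_matte)                   # side mask input (index assumed)
    cc['maskChannelMask'].setValue('rgba.alpha')   # knob name assumed; set via the mask dropdown in the GUI
    return cc

cc_diff = masked_shadow(diff_dir)
cc_spec = masked_shadow(spec_dir)
```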
Sunday, Jan. 22nd:
Over the past two days, I made a lot of progress in Maya, Photoshop, and Nuke to bring my final result together. I haven't rendered the whole scene yet, only every fifth frame, but I want to get feedback from my peers on what I can improve and hear their opinions. Linked below is a video of my progress. The final look for this project will be done by Tuesday!