
What exactly is wrong with Blender's normals baking process?

Dashiva triangle
So I've been teaching myself environment modeling in Maya, which is a great tool, but I still find myself returning to my (admittedly short) roots in Blender because I'm faster there. I've held off using Blender for anything serious as far as baking goes, because I've heard on Polycount that there is something wrong with how it bakes with regard to tangent space. I've done a pretty thorough search through the forums, but beyond some relatively abstract grumbling nobody has fully explained the actual problem. Can anyone tell me what this problem is, and whether there's a workaround?

Replies

  • Tiles greentooth
    Never heard about a problem with normal map baking in Blender. You may have been caught by a rumour here.
  • JamesWild polycounter lvl 8
    I'm of the opinion that the baking in Blender is a bit crude, to be honest. It works, and can produce good results, but I don't seem to be able to use a cage to select which geometry to bake, so I often have to bake in sections with different settings and put them together in an image editor. I've seen some people make a copy of their low poly and manipulate it as if it were a cage and bake to that, but this alters the surface normals, so baking normal maps that way is a bad idea: the result won't match up properly with the low poly.
  • Dashiva triangle
    JamesWild wrote: »
    I'm of the opinion that the baking in Blender is a bit crude, to be honest. It works, and can produce good results, but I don't seem to be able to use a cage to select which geometry to bake ...

    I think this was it, basically. Maybe Sub-D modeling as a whole is a problem, though the subsurf modifier seems to work as it should compared to Maya.

    What I'm trying to do is avoid a time sink where I have problems but don't have the technical knowledge to figure it out.
  • |MM|
    I don't recommend using Blender for baking normals, but for exporting a custom cage mesh it works well with xNormal.
  • metalliandy interpolator
    The tangent space in Blender is the same one that xNormal uses, and it actually has some pretty significant advantages over the tangent spaces in Maya and Max, which you can read about here. So there is nothing wrong with Blender's baking in that regard ;)

    All in all, Blender's baking is pretty good in terms of workflow; it's easy to learn and it can bake a ton of different map types. But there are a few serious issues that need to be rectified before I would consider it a professional-level baking toolset that rivals xNormal, Maya, Max, etc.
    • Blender uses a uniform ray distance rather than a projection cage, which means it's impossible to bake smoothing groups internally due to ray misses, and it can make baking objects that are close together, such as fingers, very hard to do without getting projection errors. This is the main issue with Blender's baking toolset and is a very serious limitation.
    • Blender doesn't have an option to bake with anti-aliasing, so you have to bake at double resolution and then resize down to get 4xAA in the bakes (see the sketch after this list).
    • Blender's AO baking is extremely slow; it can take hours to bake a 2k map in some cases. This is compounded because you have to bake at double resolution to get 4xAA.
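    Since there's no AA toggle, that downscale is just an image resize done outside Blender (or in the compositor). Here's a minimal sketch of the step, assuming Pillow is installed and the bake was done at 2x the target size; the file names are made up:

    from PIL import Image

    def downscale_bake(src_path, dst_path):
        """Halve a 2x-resolution bake so the filtering acts as roughly 4xAA."""
        img = Image.open(src_path)
        w, h = img.size
        # A high-quality filter averages each 2x2 block of baked texels into one output texel
        img.resize((w // 2, h // 2), Image.LANCZOS).save(dst_path)

    downscale_bake("ao_bake_4096.png", "ao_bake_2048.png")  # hypothetical file names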
    I do 99% of my baking in xNormal (with a custom cage created in Blender) because it is super fast (CUDA ftw!) and produces excellent-quality bakes. xNormal can also handle extremely high polygon counts.

    When baking in xNormal there are a few issues when using meshes that have smoothing groups/hard edges created within Blender. They are a pain in the ass, but not unworkable. :)

    Blender's implementation of smoothing groups/hard edges actually splits the geometry rather than the vertex normals (via the EdgeSplit modifier), and although this approach does not produce erroneous results when a mesh is viewed in game, it can cause a few headaches when baking.
    • You can't use xNormal's inbuilt cage functionality with these sorts of meshes, because as the cage size is increased the split geometry opens up gaps along the cage wherever the smoothing groups/hard edges are. This in turn causes ray misses and gaps in the bake, which renders the functionality pretty useless, to the point where you might as well be using a uniform ray distance and no cage at all.
    • For the same reason, when using a custom cage you can often get an error in xNormal stating that the cage's topology or vertex count doesn't match the LP mesh you are trying to bake to, and it won't let you continue.
    So what do we do when we want to bake meshes with smoothing groups/hard edges? We make a cage in Blender and ensure that the topology/vertex count remains exactly the same as the LP mesh. :)

    For the custom cage, I use this method:
    1. Set up your smoothing groups/hard edges in Blender and add the EdgeSplit modifier to the modifier stack.
    2. Triangulate your LP mesh.
    3. Make a duplicate of your LP mesh (with the EdgeSplit modifier still in the stack, but not applied).
    4. Add a Solidify modifier to the duplicate and move it above the EdgeSplit modifier in the stack.
    5. Uncheck "Fill Rim" and check "Even Thickness" and "High Quality Normals".
    6. Change the "Thickness" to a negative number, so that the cage covers your HP mesh entirely.
    7. Apply the Solidify modifier.
    8. Go into Edit mode and select one of the faces of the outermost mesh.
    9. Invert the selection and delete the inner mesh.
    10. Go back into Object mode and apply the EdgeSplit modifier.
    11. Export the mesh with UVs and normals.
    12. Done!

    That should export a perfect cage mesh with all the smoothing groups intact, and you should now get a perfect bake.
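    If you'd rather drive steps 3-7 from Blender's Python console, here's a rough sketch (property names per the 2.6x Python API; the function name and default offset are made up, and steps 2 and 8-11 are still done by hand). Treat it as a sketch, not a tested add-on:

    import bpy

    def build_cage(offset=0.05):
        """Duplicate the active low-poly object and turn the copy into a Solidify cage."""
        lp = bpy.context.active_object            # low poly, EdgeSplit already in its modifier stack

        bpy.ops.object.duplicate()                # step 3: work on a copy; the copy becomes active
        cage = bpy.context.active_object
        cage.name = lp.name + "_cage"

        solid = cage.modifiers.new("Solidify", 'SOLIDIFY')   # step 4
        solid.use_rim = False                     # step 5: "Fill Rim" off
        solid.use_even_offset = True              # step 5: "Even Thickness" on
        solid.use_quality_normals = True          # step 5: "High Quality Normals" on
        solid.thickness = -abs(offset)            # step 6: negative so the shell grows outward past the HP mesh

        # Step 4 continued: move Solidify above the existing EdgeSplit modifier
        while cage.modifiers.find("Solidify") > 0:
            bpy.ops.object.modifier_move_up(modifier="Solidify")

        bpy.ops.object.modifier_apply(modifier="Solidify")   # step 7
        # Steps 8-11 (select the outer shell with L, invert, delete the inner shell,
        # apply EdgeSplit, export with UVs and normals) are left to do manually.
        return cage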

    Sorry for the long post, but hopefully that will clear some stuff up :)
  • JamesWild polycounter lvl 8
    +1 on it being a bit slow as well. I'm currently using Blender in a diffuse-only pipeline and it's not terribly quick; reflective surfaces in particular are killer. I find it's best when baking in Blender to do a 2-8x oversize map and downscale, as you said, to get AA, but you should take that into account in your materials too. There's no point in doing an 18-sample diffused reflection (the default), because at 8x oversize that's 1152 reflection samples PER PIXEL of your final map. You can also drop the settings in the environment panel, because they don't make much difference either.
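    To put those numbers in context, the per-final-pixel cost is just the oversize factor squared times the material's sample count; plain arithmetic, nothing Blender-specific:

    def samples_per_final_pixel(oversize_factor, material_samples):
        # An 8x oversize map has 8*8 = 64 baked texels behind every final texel,
        # and each of them pays the full material sample cost.
        return oversize_factor ** 2 * material_samples

    print(samples_per_final_pixel(8, 18))  # 1152 - the default 18 mirror samples
    print(samples_per_final_pixel(8, 2))   # 128  - same edge AA, far cheaper materials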

    Here, a pretty ugly material was 2x multisampled down to something decent looking in the final scene.
    43epQ.png

    One other thing I've noticed: in "full render" mode, specular colour seems to have little to zero impact, which is why I'm not using the correct diffuse/specular combination to render gold.

    This is my first time using Blender for an entire project as I'm trying to push out a small scale game over the summer and don't have the cash for any expensive tools right now.
  • SnowInChina interpolator
    Oh, I know; when I started baking stuff in Blender it was so frustrating.

    I really hope they implement cages in Blender sometime soon.
  • Dashiva triangle
    metalliandy wrote: »
    The tangent space in Blender is the same one that xNormal uses, and it actually has some pretty significant advantages over the tangent spaces in Maya and Max ...

    Excellent post, thanks. I think my default workflow should probably go from Blender -> xNormal now instead of bothering with Blender's baker.
  • ivanzu polycounter lvl 10
    JamesWild wrote: »
    +1 on it being a bit slow as well. I'm currently using Blender in a diffuse-only pipeline and it's not terribly quick; reflective surfaces in particular are killer ...

    You can fix the noise with Smart Blur inside Photoshop/GIMP.
  • JamesWild polycounter lvl 8
    Yeah, though I'm not sure you understood my point, which was that if you're going to downscale the textures anyway to get multisample AA, there's little point in having multisampling in your materials, which is part of the reason it takes so long to bake.
  • metalliandy interpolator
    JamesWild wrote: »
    Yeah, though I'm not sure you understood my point, which was that if you're going to downscale the textures anyway to get multisample AA, there's little point in having multisampling in your materials, which is part of the reason it takes so long to bake.

    The multisampling only works when rendering, not baking. It would be nice to have, but it is not the reason for the slowness.
  • JamesWild polycounter lvl 8
    It multisamples reflective surfaces and shadows. If it didn't there'd be no difference between 2 reflection samples and 20.

    The only thing it doesn't do is multisampling on triangle edges.
  • JamesWild polycounter lvl 8
    For a clearer explanation, here's a basic reflective surface baked with 18 and 2 reflection samples at 512x512, downscaled to 64x64 to get edge anti-aliasing (8x oversampling).

    onnjB.png

    The one on the left looks hardly any different, and took 8 times as long to render as the one on the right.

    If you're going to oversample to get edge AA, you might as well reduce your materials' quality to take advantage of that oversampling as well to reduce overall rendering time.

    *I forgot to multiply up the sample counts: it's 4,718,592 vs 524,288, sorry. The ratio is the same.
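    For anyone double-checking, the totals are just the 512x512 bake resolution multiplied by the per-pixel mirror samples:

    # 512x512 bake, 18 vs. 2 mirror (cone) samples per pixel
    print(512 * 512 * 18)  # 4718592
    print(512 * 512 * 2)   # 524288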
  • metalliandy interpolator
    OK, I think we are talking about different things. The samples in the Mirror properties are cone samples; I was talking about multisample AA.
    Regarding AA, there is no point ever baking at more than double resolution, because once you get to 4xAA you start to hit diminishing returns. And yeah, it makes sense to reduce things that take a long time to render if you are not going to see a huge difference. :)
  • JamesWild polycounter lvl 8
    GD095.png
    Finally, here's a test with some AO and some stopwatch times. This is where a big render time difference came into play. I only did a 4x downsample here. On the left, a total of 8,388,608 samples were done while only 2,097,152 were done on the right, but the time difference was over 10x.

    This time it was done in a scene with ~300,000 triangles (lots of subdivided spheres for the rays to hit) on a Phenom II X4 840.

    The number of cone samples is just another form of multisampling, even if it's a little more intelligent than a straight up averaging of fragments. It is, at least, a lot closer to the definition of multisampling than quite a few AA methods I've seen.
  • BliND123
    metalliandy wrote: »
    For the custom cage, I use this method:
    ...
    8. Go into Edit mode and select one of the faces of the outermost mesh.
    9. Invert the selection and delete the inner mesh.
    ...

    I'm having trouble with this. I'm trying to make my cage, but scaling in Blender makes it think the vertices have changed, and there's a lot I would have to move around manually. So I'm trying this Solidify method but not understanding steps 8-9. I select one outer face and invert the selection; it selects the inside, but the rest of the outside stays selected too (everything except the one face I first picked). I'm missing something here; how do I deselect the whole outside?

    Edit: I finally managed to make my cage, but in a different way. All I did was dupe my mesh, scale it up and move any vertices I needed to, and when saving I made sure Smoothing and Apply Modifiers were turned off on both the cage and the actual mesh, and it worked. I tested it a few times to make sure it wasn't a fluke and it's working consistently.
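    For what it's worth, here's a rough, untested sketch of that duplicate-and-inflate cage from the Python side (2.6x API names). It pushes vertices out along their normals rather than scaling the object, since a plain scale drifts off the surface on anything non-spherical; the 0.05 offset is arbitrary and the sign may need flipping depending on your normals:

    import bpy

    lp = bpy.context.active_object            # the low-poly mesh, selected in Object Mode
    cage = lp.copy()
    cage.data = lp.data.copy()                # give the copy its own mesh data
    cage.name = lp.name + "_cage"
    bpy.context.scene.objects.link(cage)      # 2.6x/2.7x API; 2.8+ uses collection.objects.link()

    offset = 0.05                             # tweak per asset
    for v in cage.data.vertices:
        v.co += v.normal * offset             # inflate each vertex along its own normal

    # Export the cage (and the original low poly) with "Apply Modifiers" off, as described above.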
  • BenLind
    BliND123 wrote: »
    I'm having trouble with this. I'm trying to make my cage, but scaling in Blender makes it think the vertices have changed, and there's a lot I would have to move around manually. So I'm trying this Solidify method but not understanding steps 8-9. I select one outer face and invert the selection; it selects the inside, but the rest of the outside stays selected too (everything except the one face I first picked). I'm missing something here; how do I deselect the whole outside?


    Sorry for bringing up this old thread, but I thought someone else might have been confused by the steps. The reason you were confused, I think, is that metalliandy forgot to add that you need to press "L" to select linked, which will select the whole outer shell. After that you invert the selection and delete.