
Tiling textures from photogrammetry

armagon polycounter lvl 11
If you've ever worked with photogrammetry before, could you share your workflow for creating tiling textures? I'm thinking about capturing some rock cliffs for my terrains, but I have no idea how to make them tile.

Replies

  • gnoop
    Actually I think there is just no good workflow. There is a huge lacuna in modern software for this specific task, and honestly I don't understand why. I've been waiting many years already.

    You basically need compositing software like Fusion, but texture-focused.

    Photoshop works, especially with the current linked smart objects, but getting a depth combine there is torture. ZBrush 2.5D layers work too, but they're inconvenient as hell due to their 100% instantly destructive nature, and you'd want to re-compose a lot for tileable textures.

    My own hope is modern vector software like Affinity. It looks like it has all the necessary blending modes to do at least a Z (depth) combine. Unfortunately I have no Mac, so I can't try it right now.


    Somebody, please, write software with non-destructive vector/bitmap layers carrying depth and roughness info, plus something like Blender's ZCombine node with a proper depth combine and anti-aliased masks for the other channels, plus SSAO, the realtime lighting/shadows ZBrush does, and passes output. Cheaper than Nuke.
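    The depth combine described above can be sketched in a few lines of numpy. This is an illustration of the idea only, not any particular package's feature; `z_combine` and its signature are hypothetical, and floating-point image arrays are assumed:

```python
import numpy as np

def z_combine(color_a, height_a, color_b, height_b, feather=0.01):
    """Per pixel, keep whichever patch sits higher, with a feathered mask."""
    diff = height_a - height_b
    # Soft step across the height crossover instead of a hard 0/1 threshold,
    # so the color mask comes out anti-aliased rather than stair-stepped.
    mask = np.clip(diff / feather * 0.5 + 0.5, 0.0, 1.0)
    height = np.maximum(height_a, height_b)  # Max/Lighten blend on depth
    color = color_a * mask[..., None] + color_b * (1.0 - mask[..., None])
    return color, height
```

    The same mask could then drive the roughness and AO channels so every map switches patches at the same depth crossover.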
  • Eric Chadwick
    I wonder if Substance Designer might help with this.
  • gnoop
    Substance Designer is built around producing square-shaped documents, which is pretty inflexible for tileable textures beyond cell phone games and unique UV unwraps, and also useless for non-unique textures.

    It could be helpful if they added a few extra options, but it's not really a compositor. You'd need to do a lot of redundant routine operations there compared with Fusion, for example, or even Photoshop, and end up with a huge mess of nodes that would be hard to make reusable.

    In both Fusion and Photoshop you at least have a gizmo with a usable center of transforms and don't need to redo your whole tree after changing a scale factor. They have normal cropping, one-click feathered edges, and so on.

    Actually I have no idea how to recreate Blender's ZCombine node, or maybe Fusion's Merge node, with Substance ones. If somebody has, I'd appreciate seeing it.
  • Froyok
    gnoop wrote: »
    Substance Designer is built around producing square-shaped documents, which is pretty inflexible for tileable textures beyond cell phone games and unique UV unwraps, and also useless for non-unique textures.

    It could be helpful if they added a few extra options, but it's not really a compositor. You'd need to do a lot of redundant routine operations there compared with Fusion, for example, or even Photoshop, and end up with a huge mess of nodes that would be hard to make reusable.

    In both Fusion and Photoshop you at least have a gizmo with a usable center of transforms and don't need to redo your whole tree after changing a scale factor. They have normal cropping, one-click feathered edges, and so on.

    We use Designer internally to tile scanned materials (which obviously don't tile by default), so I would say it's totally doable. Regarding the "huge mess" of nodes, that's very subjective, as there are many ways to organize a graph.

    By creating a custom filter with a good set of parameters exposed you could totally tile a lot of different materials from the same basic filter.
  • gnoop
    I think a key feature for working with scanned materials is the ability to combine patches respecting their depth.

    Could you show an example of how I could do that in Designer? I see the Max (Lighten) blending mode to combine depth. OK, now I need a mask for the color channel to separate the patches based on that depth combine, preferably an anti-aliased one, which it seems only Blender can do. Is there a way to do that in Substance Designer?
  • karynatrychyk
    When it comes to tiling ready-made textures from photogrammetry, you can try Bitmap2Material; it's fast and can produce decent results.
  • Froyok
    If you have a depth/height map, then using a Levels node should be enough to extract a part of it. You could also easily expose the min and max depth as parameters to use the Levels as a dynamic filter.

    I don't know Blender, but a very soft blur combined with another Levels node could also work if you need to generate a clean mask. We also have a node that allows extracting a color as a mask (see the "Color to Mask" node).
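    The blur-plus-Levels trick described here can be sketched outside Designer as well. A rough numpy illustration, with hypothetical helper names and a plain box blur standing in for Designer's blur node:

```python
import numpy as np

def box_blur(img, radius):
    """Very soft blur: a plain box filter with edge padding."""
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def levels(x, lo, hi):
    """Levels-style remap: [lo, hi] stretched to [0, 1], then clamped."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def soft_mask(height, threshold, blur_radius=2):
    # Step 1 (first "Levels"): hard-extract the height range above threshold.
    hard = (height > threshold).astype(float)
    # Step 2: blur, then a second Levels to tighten the ramp into a clean,
    # anti-aliased soft mask.
    return levels(box_blur(hard, blur_radius), 0.25, 0.75)
```

    The second Levels pass is what turns the blurred grey ramp back into a mostly-binary mask with a short, smooth transition zone.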
  • armagon
    I'm using a normal and diffuse made from photogrammetry, but Bitmap2Material only accepts a diffuse. I don't want to lose the details of my normal. :(

    For context, it's a dirt texture and a concrete one.
  • gnoop
    I need not just a part, but rather a threshold mask showing where one input is higher than the other.

    If the blur trick worked in pixels, the way Photoshop does it, it could be usable. That's why I think Photoshop is capable of the task, even though it needs a huge mess of layers and groups.

    In Substance Designer I can only set the blur to some numeric value, and after re-scaling I have to fix it everywhere manually.

    Same for the transform gizmo. If I need a precise scale I can't do it from a specified point/corner like in Photoshop or Fusion, only from a pretty useless point always in the middle.

    So I believe Substance Designer may be good for procedural effects, but not for precise compositing of patches/objects from photogrammetry with height values.
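    For reference, scaling about a chosen point (the gizmo behaviour being asked for here) is just a translate-scale-translate composition. A small numpy sketch with a hypothetical helper name:

```python
import numpy as np

def scale_about(pivot, sx, sy):
    """3x3 affine matrix that scales about an arbitrary pivot point:
    move the pivot to the origin, scale, then move it back."""
    px, py = pivot
    to_origin = np.array([[1.0, 0.0, -px], [0.0, 1.0, -py], [0.0, 0.0, 1.0]])
    scale = np.array([[sx, 0.0, 0.0], [0.0, sy, 0.0], [0.0, 0.0, 1.0]])
    back = np.array([[1.0, 0.0, px], [0.0, 1.0, py], [0.0, 0.0, 1.0]])
    return back @ scale @ to_origin
```

    The pivot point itself is a fixed point of the resulting transform, which is exactly what scaling "from a corner" in Photoshop or Fusion gives you.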
  • karynatrychyk
    Sorry, I can't tell from the discussion: are we talking about blending two textures together, or about making a texture tile? I mean, where is the problem? :)
    As for Bitmap2Material, you can put textures in all slots, not only diffuse.
  • gnoop
    karynatrychyk wrote: »
    As for Bitmap2Material, you can put textures in all slots, not only diffuse.

    From my limited experience it generates depth automatically from a single input, basically a photo, and to be honest that's an absolutely useless approach.

    It could be suitable only for tiny high-frequency details.

    Maybe I'm missing something, so I'd appreciate it if you told me how you feed the other inputs there.
  • m4dcow
    Bitmap2Material is designed to create multiple maps from a source texture, but you just want to make your different captured inputs tile the same way. The operations B2M uses to make textures tile are probably available in Substance Designer, so it would be a matter of creating a substance that accepts all the maps you captured and makes them tile correspondingly.
  • mattyinthesun
    Personally, I wouldn't try to make the normal map tile at all. I would make a tileable height map and diffuse map, then make the normal map from that, either using Substance or by projecting the height map as a displacement in ZBrush/Mudbox and baking it out as a normal map.
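    Deriving the normal map from an already-tiling height map, as suggested above, is only a few lines of numpy. This sketch (a hypothetical helper, not Substance's or ZBrush's implementation) uses wrap-around central differences so the resulting normal map tiles too:

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Tangent-space normal map from a height map via central differences.

    np.roll wraps around the borders, so a tiling height map produces
    a tiling normal map with no seam at the edges."""
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * 0.5 * strength
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * 0.5 * strength
    n = np.stack([-dx, -dy, np.ones_like(height, dtype=float)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return n * 0.5 + 0.5  # pack [-1, 1] into [0, 1] for an 8-bit texture
```

    The `strength` factor plays the role of the height-to-normal intensity slider found in most baking tools.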
  • armagon
    That's one of the problems: after displacing it in ZBrush, how would you make it tile? By hand?

    Or would Substance be better than Agisoft PhotoScan at creating displacement from the photo source?
  • karynatrychyk
    gnoop wrote: »
    From my limited experience it generates depth automatically from a single input, basically a photo, and to be honest that's an absolutely useless approach.

    It could be suitable only for tiny high-frequency details.

    Maybe I'm missing something, so I'd appreciate it if you told me how you feed the other inputs there.

    You are right. As far as I understand, the workflow is putting the photoscanned piece into ZBrush and extracting diffuse, normal, and other maps from it. Then if you put all these maps into B2M and use the Make It Tile option, you'll get all your maps tiled. Or you can use the Substance Designer node for the same result. It probably won't work for something like bricks, but it can do the job for ground textures or concrete.
  • Gazu
    Guys, I have a workflow for it, and it's easy. Photoshop only.

    I will try to find the time to make a tutorial for you on YouTube.
  • gnoop
    karynatrychyk wrote: »
    You are right. As far as I understand, the workflow is putting the photoscanned piece into ZBrush and extracting diffuse, normal, and other maps from it. Then if you put all these maps into B2M and use the Make It Tile option, you'll get all your maps tiled. Or you can use the Substance Designer node for the same result. It probably won't work for something like bricks, but it can do the job for ground textures or concrete.

    Really it only works for mostly flat things and small, high-frequency height variations.
    You can't just hi-pass and blend a height map like a color image without respecting the actual height. That produces a very repetitive texture with noticeable height blending seams.

    To do it right you need a proper, actual Z (depth) combine and cleverly separated, placed, and matched fragments, plus a resulting cavity map and SSAO.

    The method you described wouldn't work for many things, not only bricks. Basically any specific, well-distinguished detail with a distinct shape would be messed up by such simple depth blending and feathered depth seams.


    Surprisingly, I've already seen a proper ZCombine in realtime shaders, but not in any texture-making software beyond Fusion or ZBrush 2.5D layers, which are both actually a pain to make textures in.

    I'm trying to do it in Photoshop, but it requires such a huge and complicated stack of layers and blending modes that I can't figure out a thing in my own files after a couple of days.

    If Blender had a standard transform gizmo in its compositing Transform node, I would probably be happy with just that.
  • mikolajspy
    Hi,
    I'm creating a series of photoscanned tiling textures for the Unity Asset Store (you can find the link in my signature).
    I'm just a poor student, so here is my cheap workflow without much fancy software:
    I do the scan in Agisoft PhotoScan, then import the output into Blender and set the size of a plane so it covers the area I want on the texture. Then I simply bake the texture, normal, and heightmap from the scan to the plane and make them tileable in GIMP :)
    In my opinion the results are very nice :)
    Here are some live examples: https://sketchfab.com/mikolajspy/folders/3746e391ee934f4a9315d1b0f67a27f1
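    One common way to do the "make them tileable in GIMP" step (an assumption here; the post doesn't say which technique was used) is a 50% offset plus blending the resulting seams away. A rough numpy sketch of an automatic offset-and-crossfade version, with hypothetical helper names; the key point is that the same operation must be applied identically to diffuse, normal, and height so they stay registered:

```python
import numpy as np

def _edge_ramp(n, fade):
    """Weight that is 1 in the middle and falls to 0 at both borders."""
    x = np.linspace(0.0, 1.0, n)
    return np.clip(np.minimum(x, 1.0 - x) / fade, 0.0, 1.0)

def make_tileable(img, fade=0.25):
    """Cross-fade the image with a half-offset copy of itself.

    The 50%-offset copy is seam-free at the borders, so letting it take
    over near the edges hides the original wrap seam; its own seam lands
    in the middle, where the original image wins."""
    h, w = img.shape[:2]
    shifted = np.roll(np.roll(img, h // 2, axis=0), w // 2, axis=1)
    weight = np.minimum.outer(_edge_ramp(h, fade), _edge_ramp(w, fade))
    if img.ndim == 3:
        weight = weight[..., None]
    return img * weight + shifted * (1.0 - weight)
```

    Hand clone-stamping in GIMP over the offset image will usually look better than a blind crossfade, which can ghost distinct details; this is the quick automatic variant.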
  • PennyB
    Hello,

    I too am trying to figure out a good workflow for using photogrammetry to make tiling textures.

    I've managed to get a base OBJ and texture out of the photoscan software, and now I'm a bit stuck on how to make it tile in ZBrush. What I want to do and what I can achieve seem slightly different. I thought there might be a way to use the RGBA01 texture I grabbed from the ZBrush document in Spotlight to paint both the RGB and heightmap information at once, using the Wrap option under the brush settings to make a tileable texture, but it seems I can't paint the height information stored in the alpha of the texture.

    Am I doing something wrong, or is this just not something that ZBrush can do?
    I've read a fair few articles about tiling textures in ZBrush now, and someone recommended Substance, but none of the links I've found really relate to the type of thing I'm trying to do.
    [Attached images: tile_01.jpg, tile_02.jpg]

    Apologies if this post comes out like gobbledegook; it's my first time here. :)
  • Gazu
    Hey Guys,
    my Tutorial is done.

    You will learn how to create multiple tileable textures at once, so that at the end of the tutorial you have tileable textures which fit together perfectly and are ready for your favourite game engine.

    https://www.youtube.com/watch?v=-B-xAhujpuI
  • a3sthesia
    Gazu wrote: »
    Hey Guys,
    my Tutorial is done.

    You will learn how to create multiple tileable textures at once, so that at the end of the tutorial you have tileable textures which fit together perfectly and are ready for your favourite game engine.

    https://www.youtube.com/watch?v=-B-xAhujpuI

    It would be great to work out a way to keep all the 3D data coming out of the photoscan, without reducing it to a flat 2D image.

    I wonder if the same workflow used to clone out seams in the image in Photoshop could be used in ZBrush or Mudbox to clone 3D sculpted detail and remove seams...

    A mixture of all those Naughty Dog sculpted tiling texture tutorials floating around, and some of that Bitmap2Material/Substance Designer technical wizardry...?