Hello,

Here is the scenario for which I would like to find a solution.

I have very precise 3D models onto which I want to apply textures based on 2D image examples given by the user. I'm thinking of a "point-and-click" interaction where the user would click on some image region to define a texture exemplar and then on the 3D model, or a sub-component thereof, to texture it. The "algorithm" would then texture the model with the exemplar.
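To illustrate what "texturing without UV-maps" could mean in the simplest case, here is a minimal sketch (my own, not from any particular paper) that assigns per-vertex texture coordinates by planar projection along each vertex's dominant normal axis, a triplanar-style trick that needs no manual unwrapping. The function name `planar_uvs` and the numpy-only setup are assumptions for illustration:

```python
import numpy as np

def planar_uvs(vertices, normals):
    """Assign per-vertex UVs by projecting each vertex onto the plane
    orthogonal to its dominant normal axis (a simple triplanar-style
    projection -- no manual UV unwrapping required)."""
    vertices = np.asarray(vertices, dtype=float)
    normals = np.asarray(normals, dtype=float)
    uvs = np.empty((len(vertices), 2))
    axis = np.argmax(np.abs(normals), axis=1)  # dominant axis per vertex
    for i, (v, a) in enumerate(zip(vertices, axis)):
        uvs[i] = np.delete(v, a)  # drop the dominant coordinate
    # normalize into [0, 1] so the UVs can index the exemplar image
    lo, hi = uvs.min(axis=0), uvs.max(axis=0)
    return (uvs - lo) / np.where(hi > lo, hi - lo, 1.0)

# e.g. a unit square facing +z maps straight to its xy coordinates:
# planar_uvs([[0,0,0],[1,0,0],[0,1,0],[1,1,0]], [[0,0,1]]*4)
```

Of course this breaks down at seams between projection axes; the paper linked below handles the general case far better, but it shows the level of automation I'm after.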

I'm interested in papers, libraries and applications that can help me implement this texturing operation WITHOUT having the user deal with UV-maps and other complicated modelling concepts. The paper below is the kind of algorithm that could help, but I'm looking for more options, ideally with code or a library usable commercially.

https://geometry.cs.ucl.ac.uk/projects/2016/texture_transfer/

The method need not be state of the art, as the most important aspects are ease of use for the end user and the possibility of eventually automating the operation, maybe by restricting the kind of image used (showing only the texture to apply) or by adding some "intelligence".

Other constraints:

  • No 3D model generation at all. Our models are very precise and come from engineering CAD designs, so the geometry must not be modified.
  • No text-to-image methods. Even though it's an interesting research topic now, that's not what I need.
  • The genus (number of holes) of the 3D model is arbitrary but I would be happy to start with a solution for models without holes.
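On that last point, a closed orientable triangle mesh can be tested for genus 0 via the Euler characteristic, χ = V − E + F = 2 − 2g. A small sketch (the helper name `mesh_genus` is mine, and it assumes a watertight, manifold triangle mesh):

```python
import numpy as np

def mesh_genus(faces):
    """Genus of a closed orientable triangle mesh via the Euler
    characteristic: chi = V - E + F, with chi = 2 - 2*genus."""
    faces = np.asarray(faces)
    V = len(np.unique(faces))
    # each triangle contributes 3 edges; shared edges are counted once
    edges = np.sort(faces[:, [0, 1, 1, 2, 2, 0]].reshape(-1, 2), axis=1)
    E = len(np.unique(edges, axis=0))
    F = len(faces)
    chi = V - E + F
    return (2 - chi) // 2

# a tetrahedron (V=4, E=6, F=4) has chi = 2, hence genus 0:
# mesh_genus([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])  # -> 0
```

So an initial solution could simply reject models where this check returns a nonzero genus.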

Regards,

Bruno
