A semi-formal dress with lace accents. Designed with Stable Diffusion. Fully simmed, with 3 patterns and a 4th preset as an example of how to tint the dress.
The first way is how I did it with this dress. I'll generate a few hundred photo shoot images where I describe the dress in the prompt.
For example, "A studio photo of a woman in a sheer black dress, simple background (a bunch of random bullshit about the type of camera used)". Then I have a few starting models based on necklines and length that I will cut to form. I don't do a lot of the 3D modeling other creators do because I usually do most of the shaping using textures.
Then I'll use the generations I like as a base for img2img generations to get a more solid idea of what I'm going for. Once I like it, I take the model into Substance Painter and do the textures.
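If you script that loop instead of running it through a UI, it looks roughly like this in diffusers. Just a sketch: the checkpoint, prompt wording, and strength value are placeholders, not my exact settings.

```python
import torch
from diffusers import StableDiffusionPipeline, StableDiffusionImg2ImgPipeline

# txt2img pass: batch out "photo shoot" reference images from a prompt.
# Checkpoint is a placeholder; swap in whatever model you actually use.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompt = "A studio photo of a woman in a sheer black dress, simple background"
batch = pipe(prompt, num_images_per_prompt=4, num_inference_steps=30).images

# img2img pass: take a generation you like and push it toward the final design.
# Reuses the already-loaded weights instead of loading a second pipeline.
img2img = StableDiffusionImg2ImgPipeline(**pipe.components)

refined = img2img(
    prompt=prompt + ", lace accents, semi-formal",
    image=batch[0],              # the generation you picked
    strength=0.5,                # lower = stays closer to the reference image
    num_inference_steps=30,
).images[0]
refined.save("dress_concept.png")
```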
The other way I'll use SD is to create tileable textures for different materials. Then I pass them through another program to create the height, roughness, and normal maps. I take those maps into Substance Designer, have it do some math and magic to get the material looking realistic, then import it into Painter to use.
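For the tileable part, I won't claim this is what any particular UI does under the hood, but one common scripted trick is to switch the model's convolutions to circular padding so the edges of the generation wrap around seamlessly. Rough sketch, with the checkpoint and prompt as placeholders:

```python
import torch
from torch import nn
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def make_tileable(model: nn.Module) -> None:
    # Circular padding makes each conv wrap around at the image borders,
    # so the left/right and top/bottom edges of the output line up.
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            module.padding_mode = "circular"

make_tileable(pipe.unet)
make_tileable(pipe.vae)

tile = pipe(
    "seamless tileable black lace fabric texture, top-down, flat lighting",
    num_inference_steps=30,
).images[0]
# This is just the base color; height/roughness/normal maps still come from
# a separate map-generation tool afterwards.
tile.save("lace_albedo.png")
```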
It is still a largely manual process. But it has cut down heavily on how long it takes to come up with an idea. And creating materials has gone from a process that took me 1-3 hours to a matter of maybe 15 minutes if I get a good gen. A rougher gen that is close but not perfect still takes about an hour.
As a creator the big benefit is that it's helped me stop creating the same dress over and over. My personal VaM has like 5 or 6 unreleased dresses just because they are basically the same as stuff I already made. It takes a lot of the pressure to be creative constantly off of me. Sometimes I don't want to think, I just want to paint.