SSD 1T drive solely for VAM Cache

janet67

I did not have another PCIe slot available, so I had to go SATA. I had to do something, as I kept bottoming out. So far, scenes do seem a little better. Has anyone else gone this route and found good results? I have 32 GB of RAM and two empty slots, but of course my heat sink is covering one of the available slots. I wish more creators would make all the aesthetics easier to remove; I still run into scenes where I have to go into the JSON file to remove some things. Still, I can't complain too much, as the creators are awesome and I appreciate their hard work.
 
The cache does not benefit that much from an SSD versus an HDD. For the VAM files themselves, yes, an SSD is a big jump, but for the cache you could just use an HDD and save the SSD space for better things.
 
I only have the two SSD drives in my PC now. From what you say, I think the advantage for me is that I am no longer completely running out of space on my main SSD and crashing VAM as the VAM cache grows huge. But for sure I will be able to have a much bigger cache now! The drive was only $64 and the cable $4. Still, I would think an SSD would be at least a little better than an HDD for anything. I cannot really run tests, as I have no HDD. I appreciate your input, though; most of this stuff is beyond me. Also, my main drive can now have a decent cache of its own, as I am sure VAM's cache was competing with it before. IDK LOL!
 
Yeah, you get some gains. This applies more to those with existing HDDs who don't need to spend money on an SSD solely for the cache.
 
Still, I would think an SSD would be at least a little better than an HDD for anything.

You'd think so, but it turns out not to be the case. From what I can figure out, the CPU overhead of processing a texture while loading it takes longer than reading the data from disk, so disk access time isn't what's slowing it down. You can do the math yourself: check the size of the texture file, your disk transfer speed, and the time it takes to load. The disk delivers the data far faster than VAM consumes it.
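If you want to run that back-of-the-envelope check, here is a minimal sketch; all three input numbers are placeholders to replace with your own measurements.

```python
# Rough check: is raw disk throughput really the bottleneck for texture loading?
# All numbers below are placeholders -- substitute your own measurements.

texture_size_mb = 45.0    # size of the texture file on disk, in MB
disk_speed_mb_s = 550.0   # sequential read speed of your drive, in MB/s
observed_load_s = 2.0     # time the texture actually takes to load

theoretical_read_s = texture_size_mb / disk_speed_mb_s
share = 100 * theoretical_read_s / observed_load_s

print(f"Pure disk read would take ~{theoretical_read_s:.3f} s")
print(f"Observed load time is {observed_load_s:.1f} s")
print(f"Disk time accounts for only ~{share:.1f}% of the load")
```

With placeholder numbers like these, the raw read is well under a tenth of a second, which is the point: the rest of the load time is CPU work, not disk.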

VAM definitely performs better on an SSD than an HDD, but the cache can be on either.
 
Hey, thanks for the input. I do have VAM on my fastest SSD, so I am good there. I believe you, and decline to do the math. I guess my biggest advantage is now having the cache on a separate drive from VAM, which is on my main C: drive. I have to wonder whether my new VAM cache drive will just continue to grow and fill up the entire 1 TB drive. I have:
AMD Radeon RX 5700 XT 8 gig
Intel(R) Core(TM) i7-10700 CPU @ 2.90GHz
RAM 32.0 GB (31.9 GB usable)
certainly not high end by today's standards, but I would think reasonable. Yet of course, VAM can be a monster.
 
Mine's currently sitting at 62 GB. It's been as high as 350 GB. Some people seem to think clearing the cache regularly is a good idea, but I don't buy it. I only clear mine if something gets messed up and textures aren't loading right.

It only increases in size when you load a texture that isn't already in the cache, so after a while it pretty much stops growing, as almost everything you usually use is already in it.
 
I had VAM running from a server-grade SATA HDD in the past and did some measurements against SATA SSDs with and without DRAM cache, measuring VAM load times on a bloated installation. I do not have the numbers anymore, but they ranked like this:
  1. SATA SSD with DRAM cache - fastest, but within the margin of error of 2), like 1-3 seconds
  2. SATA HDD Seagate Ironwolf Pro 10 TB
  3. SATA SSD no DRAM cache, VAM load time very long!
The measurements on the SATA SSDs are from multiple drives: Samsung, HP, and cheap no-name brands. I have a bunch of them lying around here from my previous IT job. They were 512 GB / 1 TB in size. The one big major difference in performance always came from whether they had a DRAM cache or not.

I did not test the VAM cache specifically here - instead the entire VAM folder was copied to that drive. I would have liked to compare against NVMe too, but I have only one M.2 slot, and that's the system drive, which I will not touch. Too important.

I even tried RAID 0 with two known-good SATA SSDs - not worth it.

Edit / speculation:
My guess is that we are all using a very suboptimal texture format anyway, which unnecessarily inflates texture load times in particular.
PNGs and JPEGs are stored in a way that requires them to be converted before they are usable from VRAM.
There are formats specifically designed to be used on GPUs, and Unity 2018 actually supports them, but VAM does not.
BC7, for example, could be loaded directly: no CPU time spent converting into the pixel format the GPU needs, and mipmaps already included. Sure, the package file sizes would be bigger, but on modern SSDs (especially NVMe) I doubt this would be a bottleneck.
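To put a rough number on the size trade-off: BC7 packs each 4x4 texel block into 16 bytes (1 byte per texel), so the on-disk size and the raw read time are easy to estimate. The drive speeds below are assumed round figures, not measurements from this thread.

```python
# Estimate the on-disk size of a BC7 texture with a full mip chain,
# and how long a straight sequential read would take.
# Drive speeds are assumed round figures, not measured values.

def bc7_size_bytes(width, height):
    """BC7 stores each 4x4 texel block in 16 bytes (1 byte per texel)."""
    blocks_x = (width + 3) // 4
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * 16

def bc7_size_with_mips(width, height):
    """Sum all mip levels down to 1x1 (roughly 1/3 extra over the base level)."""
    total = 0
    while True:
        total += bc7_size_bytes(width, height)
        if width == 1 and height == 1:
            break
        width, height = max(width // 2, 1), max(height // 2, 1)
    return total

size_mb = bc7_size_with_mips(4096, 4096) / (1024 * 1024)
for name, speed_mb_s in [("SATA SSD", 550), ("NVMe SSD", 3500)]:
    read_ms = size_mb / speed_mb_s * 1000
    print(f"4096x4096 BC7 + mips: {size_mb:.1f} MB, ~{read_ms:.0f} ms from a {name}")
```

A 4K texture comes out around 21 MB with mips, which even a SATA SSD can read in a few tens of milliseconds, so the read itself is unlikely to be the bottleneck.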

Unfortunately it's not that simple to "just add" another format to VAM. One has to consider special scenarios, for example when the genital texture is generated from the torso texture. There are many cases where combining multiple texture "layers" into one final texture would be the optimal approach for render performance and lower VRAM consumption - decals, for example. I do not know how the VAM caching system works, but if I had to guess, it does exactly that: combines multiple textures before they are shoved into precious VRAM, to save memory. I know that's how DAZ Studio does it.
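Purely as an illustration of that layer-combining idea (this is not VAM's or DAZ Studio's actual pipeline, and the file names are made up), a minimal sketch with Pillow:

```python
# Minimal sketch of "baking" texture layers: composite a decal with
# transparency onto a base skin texture once, so the renderer only
# ever has to keep a single combined texture in VRAM.
# Hypothetical file names -- not part of any real VAM package.
from PIL import Image

base = Image.open("torso_base.png").convert("RGBA")
decal = Image.open("torso_decal.png").convert("RGBA")

# The decal must match the base resolution before compositing.
decal = decal.resize(base.size)

combined = Image.alpha_composite(base, decal)  # alpha-blend decal over base
combined.save("torso_combined.png")
```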
 
Wow, thanks for the technical details. Although the new SSD I am using specifically for the VAM cache does not have DRAM, my original C: drive, where VAM lives, is an SSD with DRAM.
 
The problem I am now encountering is that with 4,000+ .var files, most of the program slows down, and that's with a Gen 4 M.2 SSD reading at 7,500 MB/s+.
 
Right now, I have about 4,500 var files. I started to move some of them to a separate folder on my second SSD, so I no longer have a dedicated SSD for the cache. Right now, my cache is at 97.3 GB. My goal is to get the majority of my vars onto the second drive, as there are maybe a few hundred vars that I have actually used in the last couple of months. It makes no sense to have them all on my main drive. There are tons of good ones, of course, so occasionally I will look through them.
 
Edit / speculation:
My guess is that we are all using a very suboptimal texture format anyway, which unnecessarily inflates texture load times in particular.
PNGs and JPEGs are stored in a way that requires them to be converted before they are usable from VRAM.
There are formats specifically designed to be used on GPUs, and Unity 2018 actually supports them, but VAM does not.
BC7, for example, could be loaded directly: no CPU time spent converting into the pixel format the GPU needs, and mipmaps already included. Sure, the package file sizes would be bigger, but on modern SSDs (especially NVMe) I doubt this would be a bottleneck.

This is actually exactly what the VAM cache is for. It saves textures in a GPU-ready format, so no CPU time is needed to convert them after loading.

I did some testing when the feature was added and found essentially no load-time difference whether the cache was on my M.2 drive or my HDD. My conclusion is that raw data transfer speed is not the bottleneck for texture loading.
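If anyone wants to repeat that kind of comparison outside VAM, here is a rough sketch that just times raw reads of the same cache folder copied to two drives. The paths are placeholders, and the OS file cache will skew a second pass, so use freshly copied files or reboot between runs.

```python
# Time raw reads of the same folder from two different drives.
# Placeholder paths -- point them at copies of your VAM cache folder.
# Note: Windows caches files in RAM, so repeat runs may read from memory;
# use fresh copies or reboot between runs for a fair comparison.
import time
from pathlib import Path

def time_read(folder):
    start = time.perf_counter()
    total_bytes = 0
    for f in Path(folder).rglob("*"):
        if f.is_file():
            total_bytes += len(f.read_bytes())
    elapsed = time.perf_counter() - start
    return total_bytes / (1024 * 1024), elapsed

for label, folder in [("M.2", r"D:\VamCacheCopy"), ("HDD", r"E:\VamCacheCopy")]:
    mb, secs = time_read(folder)
    print(f"{label}: {mb:.0f} MB in {secs:.2f} s ({mb / secs:.0f} MB/s)")
```

Even when the raw throughput differs a lot between the drives, the in-VAM load times can end up close, which is the point being made above.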
 