Extract everything from your .var files and save them all separately


ruinousdeity submitted a new resource:

Extract everything from your .var files and save them all separately - Extract everything from your .var files and save them all separately, just like the title

Hello everyone, I am a complete beginner to VAM and Python; I have had Python-related software installed for no more than 48 hours. I referred to the character appearance extraction program at https://hub.virtamate.com/resources...utomatic-extraction-of-various-presets.25763/ and the appearance extraction script at https://hub.virtamate.com/resources/appearance-preset-extractor.29467/. I tried to create this script with the...

Read more about this resource...
 
I don't understand, why do you want to extract VARs?
What do you hope to achieve?
 
ruinousdeity updated Extract everything from your .var files and save them all separately with a new update entry:

file names fix

Now, the script should be able to correctly save the files in the specified folder and use only the file name, not the entire path, as the output file name.
Note: YOUR\\VaMX\\AddonPackages and YOUR\\VaMX\\EXTRACTATOMS both need to be changed to your actual directory locations; the doubled backslash (\\) is the required path syntax.
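To make that concrete, here is one way those two locations might be set near the top of the script; the variable names are just illustrative, and a raw string (r"...") works as an alternative to doubled backslashes:

```python
from pathlib import Path

# Adjust both locations to your own folders.
ADDON_DIR = "YOUR\\VaMX\\AddonPackages"     # doubled backslashes
OUTPUT_DIR = r"YOUR\VaMX\EXTRACTATOMS"      # raw string: single backslashes are fine

addon_path = Path(ADDON_DIR)
output_path = Path(OUTPUT_DIR)
output_path.mkdir(parents=True, exist_ok=True)  # create the output folder if it is missing

print(f"Reading .var files from: {addon_path}")
print(f"Extracting into:         {output_path}")
```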

Read the rest of this update entry...
 
I don't understand, why do you want to extract VARs?
What do you hope to achieve?
In this thread

The administrator said
'In general, VaM performance is not dependent on the number of vars. VaM performance is, however, dependent on the number of morphs if you get over a certain threshold. Why?

VaM runs a separate thread for each Person atom that does the following:

  • calculates bone changes
  • applies morphs
  • merges mesh (base + grafts)
  • recalcs skin collider sizes as needed due to morphs
  • soft-body calcs
  • skins mesh
  • more skin collider calcs
  • applies soft-body mesh changes
All of that generally runs while the main Unity thread is off doing other things, and completes before the main thread needs the output data. This is measured in 2 places. The FIXED Thread Wait, and the UPDATE Thread Wait. Now if you have a really good processor and GPU and are running in the hundreds of FPS range, you could find these Wait times to be non-zero. The entire thread time is measured by THREAD. Ultimately your FPS could be limited by the main thread or one of these side threads.

Everything on the side thread is not dependent on the number of vars, except for "applies morphs". That is dependent on the number of morphs. Due to complexities with formula morphs (morph controlled morphs - MCMs) and other factors, the "applies morphs" system in 1.X iterates over every possible morph to see if it has changed compared to the last frame. This is normally fine with a reasonable number of morphs, but for those with 10000s of morphs this can become the critical factor and results in this side thread taking more time than the main thread and ultimately limiting your FPS. If I could easily fix it to a demand or event driven morph change system, I would, but as mentioned it is a bit more complex under the hood so this is not an easy change. Please note that even if you have the morphs set to not preload for a specific var file, they are still scanned and considered because a scene or trigger could activate them. So they are still part of the list of morphs that is iterated on. The preload off just prevents them from appearing in the UI.

My only suggestion at this time is to cull the number of morphs you have in your system by disabling or removing VAR files that have a lot of morphs and you are not regularly using. '
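To illustrate the admin's point in code (a toy sketch, not VaM's real implementation; all names here are made up): the 1.X behaviour is like the first function below, whose per-frame cost grows with the total number of installed morphs even when nothing changed, while a demand/event-driven system would only touch the morphs that actually changed.

```python
# Toy illustration only; these classes and functions are made up, not VaM code.

class Morph:
    def __init__(self, name, value=0.0):
        self.name = name
        self.value = value        # value set by the scene / triggers
        self.last_value = value   # value applied last frame

# 1.X-style: every frame, scan EVERY morph to see if it changed.
# Per-frame cost is O(total morphs), even if none of them changed.
def apply_morphs_scan_all(all_morphs):
    for m in all_morphs:
        if m.value != m.last_value:
            m.last_value = m.value
            # ...apply the morph delta to the mesh here...

# Demand/event-driven alternative: only morphs whose value was actually
# changed get queued, so per-frame cost is O(changed morphs).
def apply_morphs_event_driven(changed_queue):
    while changed_queue:
        m = changed_queue.pop()
        m.last_value = m.value
        # ...apply the morph delta to the mesh here...
```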

So I believe meticulous management of the objects inside .var files is necessary: storing them in the correct directory structure for VAM, and removing or consolidating genuinely duplicated morphs, materials, and scripts. VAR is convenient for downloading and uploading, but it is not as efficient for actual operation. At the same time, after decompressing and checking with 7-Zip, I found that out of 14,000 .var files, around 300 actually had errors in their content paths or file headers.
 
This is only a problem when there's a lot of VARs with morphs using "preload morphs": true. You can change this either in the VAR or inside VaM in the Package Manager.
You can find those relatively easily and disable the ones that shouldn't preload.
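A rough Python sketch of how one might list which packages declare that flag without opening VaM (the key spelling inside meta.json and the folder path are assumptions; adjust them to your setup):

```python
import re
import zipfile
from pathlib import Path

# Match a truthy "preload morphs"-style flag inside meta.json, allowing for
# spelling variants such as "preloadMorphs": "true".
FLAG = re.compile(r'"preload\s*morphs"\s*:\s*"?true"?', re.IGNORECASE)

for var in Path("YOUR\\VaM\\AddonPackages").glob("*.var"):
    try:
        with zipfile.ZipFile(var) as zf:
            meta = zf.read("meta.json").decode("utf-8", errors="replace")
    except (zipfile.BadZipFile, KeyError):
        continue  # not a real zip, or no meta.json inside
    if FLAG.search(meta):
        print(f"{var.name}: morph preloading enabled")
```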

There are a lot of differences in VARs; they go from excellent packaging practices to absolute vomit-level packaging care. The best thing you can do is, first of all, not use badly packaged VARs, then not keep everything you download, then other things. Extracting VARs is not a procedure that I can see helping you without causing much bigger problems.
 
There was a time, quite a while back before Python was a thing, when I would have taken the initiative to solve a problem like this on my own. Unfortunately I've fallen so far behind where the tech is that my brain just clogs and won't push through any difficulty I try to work out, so it's really good to see someone attempting to tackle (in any way they can) issues similar to the ones I have with software and programs. The VaM community would benefit vastly from more efficiency in how VaM loads and processes everything. I have so much damn bloat in my file structure it's shameful, but since much of it is purchased content from creators I can no longer support, I'm reluctant to go through it all and trim the fat. Basically I'm trying to say that your time and efforts are appreciated..... =)
 
I don't think this is a good idea, even though I have considered it. The first and most important point is that it will disrupt the entire dependency system of the VAM community. Once you unpack all the VAR files, any work you create in the future will only be visible to you. If you want others to see it, you will have to spend more effort to retrieve the VAR files you downloaded initially, and including someone else's data in your own VAR file is not advisable.

Moreover, this is an irreversible operation. If you want to reduce file size, you must delete old files.

As for the second point, if you only want to make the system run faster, I think using VAR Manager would be better.

You can keep only the VAR files and dependencies that you want to use in the current system and temporarily move the others elsewhere.
 
Would it be possible to have the script automatically repack each individual resource back into var format?
 
I don't think this is a good idea, even though I have considered it. The first and most important point is that it will disrupt the entire dependency system of the VAM community. Once you unpack all the VAR files, any work you create in the future will only be visible to you. If you want others to see it, you will have to spend more effort to retrieve the VAR files you downloaded initially, and including someone else's data in your own VAR file is not advisable.

Moreover, this is an irreversible operation. If you want to reduce file size, you must delete old files.

As for the second point, if you only want to make the system run faster, I think using VAR Manager would be better.

You can keep only the VAR files and dependencies that you want to use in the current system and temporarily move the others elsewhere.


This is only a problem when there's a lot of VARs with morphs using "preload morphs": true. You can change this either in the VAR or inside VaM in the Package Manager.
You can find those relatively easily and disable the ones that shouldn't preload.

There are a lot of differences in VARs; they go from excellent packaging practices to absolute vomit-level packaging care. The best thing you can do is, first of all, not use badly packaged VARs, then not keep everything you download, then other things. Extracting VARs is not a procedure that I can see helping you without causing much bigger problems.
I'm sorry, but I can't agree with your opinion.

"When analyzing the contents of .var files, I found that many .var contents are not generated with VAM's packager but rather with various compression software. Although the file extensions of these files are all .var, the actual compression format is not necessarily zip, but may include formats like rar and 7z.
At the same time, .var files, aside from using non-standard characters in non-English names that make it impossible for VAM's downloader to recognize these var files, also contain non-standard character paths that render the contents of these .var files unusable in Windows operating systems not using that language. This is particularly true in the case of paid content, such as: person, CUA, or clothing paths that use a variety of different Chinese and Japanese paths.

Although my computer can identify these paths and I can try to modify what happened, in Windows systems not using these languages, VAM cannot normally import these contents with non-standard character file paths.
My script has been handled when dealing with these non-standard characters in .var files, and in principle, all should be checked. However, I still hope that creators can use standard characters in both file names and internal paths of .var files. "
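As a rough illustration of the kind of checks I mean, something like this can flag .var files that are not real zips (by their first bytes) or that contain non-ASCII internal paths; the folder path is a placeholder:

```python
import zipfile
from pathlib import Path

# First bytes of the archive formats seen in the wild behind a ".var" extension.
SIGNATURES = {
    b"PK\x03\x04": "zip",
    b"Rar!":       "rar",
    b"7z\xbc\xaf": "7z",
}

def sniff_format(var_path):
    """Return the real archive format of a .var based on its magic bytes."""
    with var_path.open("rb") as f:
        head = f.read(4)
    for magic, fmt in SIGNATURES.items():
        if head.startswith(magic):
            return fmt
    return "unknown"

def non_ascii_entries(var_path):
    """List internal paths containing non-ASCII characters (zip vars only)."""
    with zipfile.ZipFile(var_path) as zf:
        return [name for name in zf.namelist() if not name.isascii()]

for var in Path("YOUR\\VaMX\\AddonPackages").glob("*.var"):
    fmt = sniff_format(var)
    if fmt != "zip":
        print(f"{var.name}: not a real zip (looks like {fmt})")
        continue
    bad = non_ascii_entries(var)
    if bad:
        print(f"{var.name}: {len(bad)} internal paths with non-ASCII characters")
```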

What bigger problems might happen?

Perhaps I need to emphasize a little: in this day and age, we cannot assume that users or downloaders will steal creators' work or use it for profit without consent. If that does happen, the product can be taken down afterward and the profits that should have been earned can be reclaimed. Anti-theft measures that presuppose the buyer's infringement always result in lower performance or even non-functionality of software or games. This happens often on the STEAM platform; as an example, consider CAPCOM and some Korean-developed anti-tamper software. It is just a way of "punishing legitimate users for no reason". In my region there is a meme: "Are you also a victim of using genuine software?"

In fact, .var files are not an encrypted format; they are a kind of zip. And the fact that meta.json cannot be fixed and saved individually is probably the main background factor behind creators' content being misused, unintentionally or deliberately, without authorization. In practice, not all creators upload a dependency.txt alongside their data. And some large scene creators, such as VH34 with both free and paid content, do not use the packager to generate .var files at all; they distribute zips to download and overwrite, to ensure correct execution.
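Because a .var is just a zip with a meta.json inside, the declared dependencies can be read without unpacking anything. A small sketch (the folder path is a placeholder, and I am assuming meta.json lists dependencies as an object keyed by package name):

```python
import json
import zipfile
from pathlib import Path

def read_dependencies(var_path):
    """Return the dependency names declared in a .var's meta.json."""
    with zipfile.ZipFile(var_path) as zf:
        meta = json.loads(zf.read("meta.json"))
    # Assumed layout: "dependencies" is an object keyed by package name.
    return sorted(meta.get("dependencies", {}).keys())

for var in Path("YOUR\\VaMX\\AddonPackages").glob("*.var"):
    try:
        deps = read_dependencies(var)
    except (zipfile.BadZipFile, KeyError, json.JSONDecodeError):
        print(f"{var.name}: could not read meta.json")
        continue
    print(f"{var.name}: {len(deps)} declared dependencies")
```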
(Two screenshots attached, taken 2023-05-17 at 01:36 AM and 01:37 AM.)
And it's not only for the reasons above.


The .var file is a good download source and a good local database or verification reference for accuracy and authorization, but it is not suited to operations that require scenes to be opened from inside the archive every time VAM starts, or to monitoring changes in AddonPackages.
Including paid content, I have over 20,000 .var files, totaling about 600GB. On average, the compression reduces these files to 60-75% of their original size. In other words, if they were all decompressed, even if I loosen the assumption and divide by 0.8x, the total after decompression would be at least over 850GB. But what is the actual total size after decompressing all of these .var files along a fixed path and letting duplicate content be overwritten? The answer is roughly 650GB. In other words, these .var files contain a considerable amount of duplicate content, or "files that coincidentally use duplicate paths and names".
"Duplicated content, or files that use duplicated paths and names," has a very negative impact on the software's memory management, and it's not just a matter of efficiency.


Would it be possible to have the script automatically repack each individual resource back into var format?

No, I would never bring about such a situation. ;)
 
I'm sorry, but I can't agree with your opinion.

Considering how much has been written, perhaps I need to give better context for my answer.
For this sentence of yours:
The administrator said
'In general, VaM performance is not dependent on the number of vars. VaM performance is, however, dependent on the number of morphs if you get over a certain threshold. Why?

I wrote the below passage:
This is only a problem when there's a lot of VARs with morphs using "preload morphs": true. You can change this either in the VAR or inside VaM in the Package Manager.
You can find those relatively easily and disable the ones that shouldn't preload.

The other part of my reply is in agreement with your latest reply, saying that people package things in all kinds of ways, ranging from excellent practices to absolutely terrible practices:
There are a lot of differences in VARs; they go from excellent packaging practices to absolute vomit-level packaging care. The best thing you can do is, first of all, not use badly packaged VARs, then not keep everything you download, then other things. Extracting VARs is not a procedure that I can see helping you without causing much bigger problems.

I'll say it again: we agree that there are a lot of terribly packaged resources around (unneeded copies of files, strange paths, etc.). However, where we disagree is in the method used to overcome these problems: unpacking VARs.

I stand by the sentence I wrote earlier: "Extracting VARs is not a procedure that I can see helping you without causing much bigger problems." But maybe there is no better option for you than unpacking, given the amount and type of content you have. Generally, unpacking is not a good strategy, but it may be the "less bad" option available to you.

I leave you with a few more ideas that may help you make your VaM more performant and neat:
  • Have more than one VaM folder, each for a different purpose with separated content. For example: a curated VaM with resources you often use, plus another VaM where you try out new content you're not sure you want to keep, etc.
  • Be more strict about what you download and keep - skip stuff with over 60 dependencies
  • Move VARs that you don't use often to a folder somewhere else, and move them back to the VaM folder only when you want to use them (see the sketch after this list)
Maybe a combination of these ideas and your method can bring some control over your VaM folder. It's always going to be a compromise somewhere, and hopefully people will improve their packaging practices so this problem is reduced over time.
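If you want to automate the third idea, here is a rough sketch of "parking" VARs that have not been touched for a while (the folders and the 90-day cutoff are placeholders, and file modification time is only a crude proxy for actual use):

```python
import shutil
import time
from pathlib import Path

ADDONS = Path("YOUR\\VaM\\AddonPackages")   # live VaM packages
PARKED = Path("YOUR\\VaM_parked_vars")      # where rarely used VARs get moved
MAX_AGE_DAYS = 90                           # crude "not used recently" threshold

PARKED.mkdir(parents=True, exist_ok=True)
cutoff = time.time() - MAX_AGE_DAYS * 86400

for var in ADDONS.glob("*.var"):
    # Modification time only approximates last use; review before deleting anything.
    if var.stat().st_mtime < cutoff:
        shutil.move(str(var), str(PARKED / var.name))
        print(f"parked {var.name}")
```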
Good luck.
 