Unity 6 is very impressive. VAM 2?

lolmao500
Unity just released a video for Unity 6 and it's quite impressive.
Unity 6 Time Ghost

One of the new things in Unity 6 is machine learning, using it to create accurate and realistic cloth deformation. They developed Unity Sentis for that and trained the ML model on 2,000 frames of vertex animation. The deformations are computed in only 0.8 ms per frame, which is impressive and would be very useful for VAM 2.0.
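The idea behind that kind of ML deformation can be sketched very roughly: train a model on frames of simulated vertex animation, then replace the expensive cloth simulation with a cheap per-frame inference. This is only an illustration under assumed toy data; it uses a plain least-squares linear map instead of Unity Sentis or a real neural network, and all sizes except the 2,000-frame count are made up.

```python
# Hypothetical sketch: learn pose -> per-vertex offsets from simulated frames,
# then do one cheap matrix multiply per frame instead of a full cloth sim.
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_pose, n_verts = 2000, 8, 100   # 2,000 training frames, as in the demo

# Fake training data: pose parameters -> per-vertex offsets (flattened xyz)
poses = rng.normal(size=(n_frames, n_pose))
true_map = rng.normal(size=(n_pose, n_verts * 3))
offsets = poses @ true_map + 0.01 * rng.normal(size=(n_frames, n_verts * 3))

# "Training": fit the linear map by least squares
learned_map, *_ = np.linalg.lstsq(poses, offsets, rcond=None)

# "Inference": one matrix multiply per frame, no physics step
new_pose = rng.normal(size=(1, n_pose))
predicted = (new_pose @ learned_map).reshape(n_verts, 3)
print(predicted.shape)
```

A real deformer would use a nonlinear network and real simulation output, but the train-once / infer-fast structure is the same.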

Another big feature is the new hair system, which is very realistic.

Meshed said that VAM 2.0 would use the latest Unity version... I wonder if VAM 2.0 will get a patch after release to "upgrade" the game to Unity 6, whether that's too much work, or whether Unity 6 will be used for development as soon as it's out, so VAM 2 is on Unity 6 when it comes out of beta... Only one man can answer these questions :)
 
Environments and effects look great, but the character model looks rather poor in my opinion. The hair also has a very artificial tone and movement to it. The issue could just be poor artistry in the showcase.
 
The hair also has a very artificial tone and movement to it.

Tone, that's a matter of preference. But motion? That's probably the most impressive "realtime" hair sim I've seen.


Unity just released a video for Unity 6 and it's quite impressive.
Unity 6 Time Ghost

One of the new things [...]

Bear with me: I'm not angry or mad, I'm chill. I'm just gonna be extremely straightforward, because this general hype around tech demos annoys me.

You should never, EVER, think about a project through the lens of stupid tech demos. Yes, I said stupid. Time Ghost is beautiful, there's no denying that. But that's not a game; that's, at best, a realtime cinematic.

Stop simply repeating the marketing talking points from their communications. UE5 did an incredible job at this, making gamers all around the world think it was a magic engine, and now developers and studios are realizing that Lumen/Nanite/whatever are not as cool and as easy to use as they'd like everyone to think. It's not hard to see: pretty much every UE5 game has released in a catastrophic state.

And if you drop "ML" in your marketing campaign, that's like the bingo of video game takes.

Whatever Unity 6 brings to the table, you should always be careful with what they say... they're there to sell an engine first and foremost. There are cool features in Unity 6, but the potential fidelity gap between the previous version and this one is not that crazy, especially in the hands of a talented developer. The most impressive features of Unity 6 are probably on the performance side (which is something you don't even see "on screen").

VAM 2 being on Unity 6 or not is not gonna make the game two times better or worse.

This is all "pre-rendered bullshit" to be honest : )
Even though it's a technical achievement visually, this is not a game; it's not what you should expect from a game that has physics, gameplay, UI, hundreds of components, and so on.

Don't fall for their marketing bullshit, engines are not making games great... studios and developers are.
 
Side note: VAM 2 will be on Unity 6. (but this has nothing to do with that tech demo :p)
 
UE5 has done an incredible job at this, making the gamers all around the world think it was a magic engine, and now developers and studios are realizing that Lumen/Nanite/whatever are not that cool and as easy to use as they'd like everyone to think.
Yeah? Why is that? Got any links to videos or some kind of info about it?
 
Yeah? Why is that? Got any links to videos or some kind of info about it?

Simply:
  • Game state at release
  • Player feedback
  • Studios/devs feedback
  • Internal discussions and everyday job experience, plus friends at other studios who are in the same deep shit, trying to make that engine spew more than 30 fps on consoles when these features are enabled
:3
 
This guy is talking too fast for me. I'm not a native speaker, my English is good, but that's exhausting :)
Thanks anyway! Maybe I'll find another one with fewer words per minute.
 
Put it at 50% speed, no sound, subtitles ;)
You're right, I haven't even thought about the option of subtitles... I'm not that old, but obviously too old 😂

But, a question before I watch the video: from what I know, UE5 has "kind of" a limit on the total amount of Nanite geometry visible in a scene, so to speak. Right now even high-end systems are struggling with the performance, right? Isn't this just a matter of time? GPUs are advancing, and in maybe 5 years average performance will be good enough to handle that amount of Nanite geometry? Or isn't this the problem at all?

Short answer if you can, no need for technical explanations cause I'll watch the video anyway :)

edit: I watched half the video and didn't understand anything (content-wise). Too much of a tech video for me, sorry.
 
But, a question before I watch the video: from what I know, UE5 has "kind of" a limit on the total amount of Nanite geometry visible in a scene, so to speak. Right now even high-end systems are struggling with the performance, right? Isn't this just a matter of time? GPUs are advancing, and in maybe 5 years average performance will be good enough to handle that amount of Nanite geometry? Or isn't this the problem at all?

Let's take this the other way: Epic is marketing those fancy features as "console compatible" and production ready. If your game's performance dies when you use those features, which are demoed as being able to run billions of polys, without getting anywhere close to the demos shown... you know there's a problem.

Now, let's take Crysis as an example: at the time of its release, the game was extremely demanding. But tweaking the settings allowed an average PC to run it somewhat "properly". Today, you're getting games with extremely poor performance at the bare minimum settings, at release.

If we do a TL;DR of the video: Nanite is worse than the usual LOD systems, AND Epic tends to drop support for standard optimization methods. Nanite is great when you don't give a fuck... but at the scale of AA or AAA games, it's not good enough to handle as much as they want you to think.

Another way of saying that: in an extreme-case scenario (billions of polys), you could get a much better result by optimizing the meshes properly yourself than by relying on Nanite. If you prefer: Nanite wastes GPU time. If you can "bruteforce" a scene with a huge GPU like a 30xx/40xx, it only means the scene would run even faster with proper optimization techniques.
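For context, the "usual LOD systems" mentioned above boil down to something very simple: artists author a few hand-optimized meshes per asset, and the engine picks one by camera distance, while Nanite tries to do that selection automatically at per-cluster granularity. A minimal sketch, with made-up thresholds and triangle counts purely for illustration:

```python
# Classic distance-based LOD selection: pick a hand-authored mesh
# by camera distance. Thresholds and triangle budgets are invented.

def select_lod(distance, thresholds=(10.0, 30.0, 80.0)):
    """Return the LOD index (0 = full detail) for a given camera distance."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)  # beyond the last threshold: lowest detail

# Hypothetical triangle budgets per LOD level
triangles = [50_000, 12_000, 3_000, 500]

for d in (5.0, 20.0, 50.0, 200.0):
    lod = select_lod(d)
    print(f"distance {d:6.1f} -> LOD {lod} ({triangles[lod]} tris)")
```

The point of the criticism above is that these hand-tuned budgets give you direct control over GPU cost per asset, which an automatic system can't always match.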

Sadly, beyond the Nanite and Lumen issues, this is also how the industry fixes problems. "We have perf issues!?" > flips DLSS/FSR on.

You can draw that conclusion often today: if the game you just bought runs badly, 90% of the time it's not a hardware issue. It's either poor technical choices or a lack of optimization (and you can translate "lack of optimization" to "not enough money/time to do the job correctly").
 
Thank you. Yeah, that's what I've already read multiple times: DLSS is used as an excuse for poorly optimized games.
 