Okay, that looks fantastic
"it's ChatGPT that will really revolutionise VaM."
How? Do you have any "insights"? Is it speculation, or do you have some valid points? I'm just curious.
"How? Do you have any "insights"? Is it speculation, or do you have some valid points? I'm just curious."
You can't see?
I want to test one in the store. My S21 does a mediocre job with room scans but can't handle flat walls because it relies on imaging, so the walls seem to bleed into nearby objects, like the wall behind my bed. I'd have to hang posters everywhere for it to work. The iPhone is supposedly much better at this; LIDAR can get very accurate readings.
- a detection system that works on the principle of radar, but uses light from a laser.
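As a rough illustration of why that principle gives such accurate depth (a sketch of the time-of-flight idea, not any phone's actual sensor API; the function and numbers are mine):

```python
# Lidar principle: time a laser pulse's round trip, convert to distance.
# Variable names and the example timing are illustrative only.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to the target: light covers the path twice, hence the / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after 10 nanoseconds hit something about 1.5 m away:
print(round(lidar_distance(10e-9), 3))  # ~1.499
```

Since the measurement is a direct timing rather than a guess from two camera images, flat featureless walls are no problem, which is exactly where photogrammetry struggles.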
OK, a quick Google, and never mind... not going to waste my money this go-around. No S23 testing needed; it's the same thing: two cameras and a PC to guess at stuff. It'd be quicker to make it in Blender and texture it with pictures. Maybe I'll borrow an iPhone just for this.
"Samsung s23 doesn't have any depth camera. You can use it with photogrammetry, but you'll need a PC with a decent GPU to process the scans. "
If we could just grab the AI from another Unity game... that would be awesome. Of course it would have to be Three Laws safe.
It's all "on the verge" of being amazing. Look at how well services like Amazon Alexa, Apple Siri, etc. understand your voice.
Add in the recent amazing advances in generating human-like speech and the obvious conversational breakthroughs with ChatGPT, and you get... an AI "partner" who can listen, understand, and answer back in a rational way.
It's only a lack of training data and the right model that stops AI models from controlling the movement too. I have seen AI-trained motion models for game characters: they can walk around, interact with objects, duck under things, and step over things, all in dynamic environments, without any pre-programmed fixed animations.
Can you imagine a VaM model standing on the other side of the room in your scene, and you just speak, saying "come here and sit down", and she does. She walks over to you and sits in the chair next to you, or maybe sits on the floor, or your lap, without any pre-programmed animation, or even you knowing which she will choose.
This is all possible now.
Each part has been done, and works. It has even been done on consumer level hardware.
We are on the verge of getting realistic interactive virtual people. There needs to be a lot of improvement in hardware to get it all running together locally, none of this cloud shit, but it's coming.
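The loop described above (listen, understand, answer back, and drive the character's motion) can be sketched as a simple pipeline. Every function name here is a hypothetical placeholder, not a real VaM or library API; it only shows how the pieces would chain together:

```python
# Hypothetical sketch of the speech -> LLM -> TTS -> motion pipeline described
# above. All functions are stubs standing in for real models.

def transcribe(audio: bytes) -> str:
    """Speech-to-text step (e.g. a local recognition model). Stubbed."""
    return "come here and sit down"

def generate_reply(utterance: str) -> dict:
    """LLM step: produce both a spoken reply and a high-level motion intent."""
    return {"speech": "Sure, on my way.",
            "motion": {"action": "walk_to", "target": "player"}}

def synthesize(text: str) -> bytes:
    """Text-to-speech step. Stubbed."""
    return text.encode()

def drive_character(motion: dict) -> str:
    """Hand the intent to a learned motion model instead of a fixed animation."""
    return f"motion model executes: {motion['action']} -> {motion['target']}"

# One conversational turn:
heard = transcribe(b"...mic samples...")
reply = generate_reply(heard)
audio_out = synthesize(reply["speech"])
print(drive_character(reply["motion"]))
```

The key design point is the last step: the LLM only emits a high-level intent ("walk_to player"), and a separate learned motion model decides the actual footsteps, pathing, and whether she sits on the chair, floor, or lap.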
I will share a couple of my thoughts. VaM is great in its boundless creativity; thanks to the mod creators, the functionality keeps increasing, and that is exactly what makes it so popular. VaM lets you realize almost any fantasy (and if you know Blender and Unity, then any fantasy), and the only one that can compete with it now is HS2, but that has no physics and so on. I've heard that some VaM users are making a game with similar functionality on UE5, but I can't say anything specific about it.
I just saw this post. To be honest, ComeCloseGame is a "Total Joke" compared to VaM. It's like comparing one average looking…

Just saw this, and honestly it looks amazing. The cloth physics are outstanding.
None of that nipples poking through the top here!
It's also a Quest 2 native app (no PC needed!), but apparently the high-level physics stuff (the cloth) requires a PC. Not sure how it will look on headset only, but not needing a PC will be pretty cool.