Question: AI in VAM? DeepSeek

Battletoad08

Is there a good free AI plugin for VaM yet? For most AI plugins, like the ones for Skyrim, you have to pay for an AI service. DeepSeek is free.
 
Pretty much all AI services are paid, at least the serious ones with actually low latency and real features. It's highly unlikely you will find something free.
You can take a look at Voxta, but it's paid... unless you install it on your own machine, if I'm not mistaken. @Acid Bubbles can give you more information.
 
It may be worth clarifying a few things about AI in general.

There are open-source large language models such as MythoMax, Llama, Mistral, etc., and closed-source models like Claude and GPT-4o.

Then there are hosts: you can host MythoMax locally on a typical mid-range GPU, larger models like Mistral NeMo may need a beefier GPU, or you can use a cloud service to host your model. For closed models you must use the model creator's own product.
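
To illustrate the local hosting option, here's a minimal sketch in Python. It assumes you already have an open model loaded in a local server that exposes an OpenAI-compatible API (for example llama.cpp's llama-server, which listens on port 8080 by default); the port, model name and prompts are placeholder assumptions, not anything specific to VaM or Voxta.

```python
# A minimal sketch, assuming a local OpenAI-compatible server is already running,
# e.g. llama.cpp's `llama-server` loaded with an open model (MythoMax, Mistral, etc.).
# The port, model name and prompts are placeholder assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed-for-local")

response = client.chat.completions.create(
    model="local-model",  # many local servers ignore this and use whatever model is loaded
    messages=[
        {"role": "system", "content": "You are a character in a VaM scene."},
        {"role": "user", "content": "Introduce yourself in one short sentence."},
    ],
    max_tokens=100,
)
print(response.choices[0].message.content)
```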

Then, a large language model alone doesn't get you very far: you need to integrate it with the software, add text-to-speech and speech-to-text, and have a way to call actions and react to things (all of which requires additional models).
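
To make that point concrete, here's a hypothetical sketch of one conversation turn. Every function body is a placeholder; in practice each step is backed by its own model or service (an STT model like Whisper, an LLM server, a TTS engine, and whatever trigger system your scene uses).

```python
# A hypothetical sketch of one conversation turn around the LLM. Every function
# body here is a placeholder: speech-to-text, the language model, and
# text-to-speech are each backed by their own model or service in a real setup.

def speech_to_text(audio: bytes) -> str:
    """Placeholder: transcribe microphone audio with an STT model (e.g. Whisper)."""
    raise NotImplementedError

def query_llm(history: list[dict]) -> str:
    """Placeholder: send the chat history to a local or hosted language model."""
    raise NotImplementedError

def text_to_speech(text: str) -> bytes:
    """Placeholder: synthesize the reply with a TTS engine."""
    raise NotImplementedError

def trigger_actions(reply: str) -> None:
    """Placeholder: parse the reply for action cues and fire triggers in the scene."""

def conversation_turn(audio: bytes, history: list[dict]) -> bytes:
    user_text = speech_to_text(audio)
    history.append({"role": "user", "content": user_text})
    reply = query_llm(history)
    history.append({"role": "assistant", "content": reply})
    trigger_actions(reply)        # actions and reactions
    return text_to_speech(reply)  # audio to play back in the scene
```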

So, DeepSeek isn't any more "free" than the open models that came before it, and it really doesn't change anything for you (it's somewhat better at reasoning and made a splash because of its low training cost claims, but for you and me it doesn't change that much: you can still host it locally or use DeepSeek's hosted service).
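
For completeness, here's a minimal sketch of what using DeepSeek's hosted service looks like, since their API is advertised as OpenAI-compatible. The base URL and model name below are taken from their public documentation as I recall it; treat them as assumptions and check the current docs, and the API key is obviously a placeholder.

```python
# A minimal sketch of DeepSeek's hosted, OpenAI-compatible API. The base URL and
# model name are assumptions based on their public docs; the API key is a placeholder.
from openai import OpenAI

client = OpenAI(base_url="https://api.deepseek.com", api_key="YOUR_DEEPSEEK_API_KEY")

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Say hello in five words."}],
)
print(response.choices[0].message.content)
```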

Now about Voxta, since that's really me wanting to use those models in VaM, I can maybe share a bit about it :D What we did is create something you can use to run local models AND closed models, and that works with VaM out of the box. Because the vast majority of people don't have two or three GeForce 4090s in their living room, we also created a cloud platform that lets you connect to several backends for some things and use local models for others, so it's flexible. I've been working on this for about two years now, so I can say: you may be able to make a quick prototype, but it's a lot of work to make something that works well :D

Voxta Server is something you install on your machine, and as of now (this may change in the future if we can figure out a way to make the project sustainable) you need to get it from Patreon. We also give you some credits for our cloud host, but you don't have to use it. The VaM plugin is free. So, once you have downloaded the server you don't need to subscribe or anything; you can use it for free as long as you host your own models and have the GPU to do so. You may be interested in the updates I keep releasing, but you don't have to get them if you don't want the new features and improvements.

With higher tiers you get our VaM scenes as well as a bunch of credits that you can spend on different speech and language model services, but again that's really up to you and not really related to open-source models. I'm saying this to stress the point that the subscription is a way to have people keeping up with all this for you, not because you have to use a cloud service to use it.
 