Hello! Thank you for your support!
I still believe that using AI with VaM will remain a narrow niche. Moreover, there are much more advanced AI plugins out there (Voxta, Alive, the AI Chat Plugin by vamX). I see the KoboldLink plugin more as an easy way to get started with the AI topic.
It seems that KoboldCPP got so many updates in 2024 that it is not easy to keep track of everything. Regarding launching KoboldCPP without a model: I checked the latest release (koboldcpp-1.80.3), but unfortunately I did not find such an option. Could you please explain which buttons to press to run KoboldCPP this way? I would like to test it as well.
If the option you are talking about is only available in the browser interface (KoboldAI Lite) and not in the server settings window, then it cannot be used with this plugin: the plugin can only send requests to the KoboldCPP server API and cannot interact with the browser frontend.
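For context, this is roughly the kind of call the plugin makes to the server. It is just a minimal sketch in Python using the requests library; the prompt and sampling parameters are only illustrative and are not the plugin's actual values:

```python
import requests

# Minimal sketch of a request to the KoboldCPP server API (not the Lite web UI).
# The prompt and parameters below are illustrative placeholders.
KOBOLD_URL = "http://localhost:5001"  # default KoboldCPP port

payload = {
    "prompt": "You are a friendly companion.\nUser: Hello!\nCompanion:",
    "max_length": 80,
    "temperature": 0.7,
}

response = requests.post(f"{KOBOLD_URL}/api/v1/generate", json=payload, timeout=120)
response.raise_for_status()
print(response.json()["results"][0]["text"])
```

Anything that only exists as a button or setting inside the Lite web page is invisible to this kind of API call, which is why the plugin cannot use it.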
Currently, in order not to overload the PC running VaM with the AI calculations, I run KoboldCPP with a model on another PC and connect the two either over the local network or via a Cloudflare tunnel (the "Remote Tunnel" checkbox on the "Network" tab in KoboldCPP).
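If you set up something similar, a quick way to check that the KoboldCPP PC is reachable from the VaM PC is to query the model endpoint. A small sketch, assuming the address and tunnel URL below are placeholders (use your own LAN IP, or the URL KoboldCPP prints when the tunnel starts):

```python
import requests

# Quick connectivity check from the VaM PC to the KoboldCPP PC.
# Replace with your machine's LAN IP, or the tunnel URL shown by KoboldCPP
# when "Remote Tunnel" is enabled (both values here are just examples).
KOBOLD_URL = "http://192.168.1.50:5001"
# KOBOLD_URL = "https://example-name.trycloudflare.com"

info = requests.get(f"{KOBOLD_URL}/api/v1/model", timeout=10)
info.raise_for_status()
print("Connected, loaded model:", info.json()["result"])
```

If that call answers with the model name, the same address should also work in the plugin's server URL setting.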