I've just uploaded a first version with the streaming support I talked about. You can get the VAR file from here. It works for me, but note that, as before, it's experimental. Would be cool to see what results other people get. It requires a bit of setup (it's not possible to run FFmpeg directly from VaM, so you have to start it manually), but once you get used to it, it should be very simple.
I tried to mux the audio into the rendered video without extra steps after recording, but nothing I tried worked. Maybe someone with more FFmpeg knowledge has an idea, but for now I'm giving up on that.
Here's a (somewhat) quick guide:
First of all, you need a command-line version of FFmpeg installed (I use version 6.1.1; I haven't tested other versions).
In the plugin's VaM UI, set the "Stream Mode" (very far down on the right) to "Stream". "Host" and "Port" can be left at their defaults. Now
before you start the video recording in VaM, you have to run FFmpeg in a console with some specific options. Here's what I use for 4K video without transparency at 60FPS:
Bash:
ffmpeg -y -f rawvideo -pix_fmt rgb24 -s 3840x1920 -r 60 -i tcp://127.0.0.1:54341?listen -vf vflip -c:v libx265 -preset medium -pix_fmt yuv420p -crf 20 video.mp4
Make sure to
read the description of the most important options below. The options for the input format must match the ones you use in the plugin,
otherwise this will not work.
Once you've started FFmpeg with these special options, you can start the video recording in VaM, which will send the raw frames to FFmpeg live instead of writing images to disk. When the recording is finished, FFmpeg should finish too, and you should then have a fully rendered video file.
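By the way, if you want to check that the FFmpeg side works without involving VaM at all, you can use a second FFmpeg instance to send a few seconds of test frames to the listening one (just an idea for a sanity check, not part of the plugin workflow). The size, rate and pixel format must match the listener's options:
Bash:
ffmpeg -f lavfi -i testsrc=size=3840x1920:rate=60 -t 5 -f rawvideo -pix_fmt rgb24 tcp://127.0.0.1:54341
If the listener then produces a short video.mp4 with a test pattern, the streaming setup is fine.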
The video file will not have audio yet. The audio file is still in the same place that it always was for the plugin. You have to merge the video and audio manually. This should be possible in any good video editor like Avidemux or LosslessCut. I use FFmpeg for it as well, with a command like this:
Bash:
ffmpeg -i video.mp4 -i audio.wav -c copy -c:a libmp3lame -b:a 256K video-final.mp4
This assumes that the audio file was renamed to "audio.wav" and put into the video file's directory. A simple script could do both of these FFmpeg commands one after the other automatically, so that it doesn't feel like two steps anymore.
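For example, a rough sketch of such a script (untested; the file names are just examples, and it waits for you to copy the audio file over before merging) could look like this:
Bash:
#!/bin/bash
set -e
# Step 1: start the listener. This blocks until the recording in VaM is done.
ffmpeg -y -f rawvideo -pix_fmt rgb24 -s 3840x1920 -r 60 \
  -i "tcp://127.0.0.1:54341?listen" -vf vflip \
  -c:v libx265 -preset medium -pix_fmt yuv420p -crf 20 video.mp4
# Step 2: wait until the recorded audio has been copied here as audio.wav.
read -p "Copy the recorded audio here as audio.wav, then press Enter..."
# Step 3: mux the audio into the final video.
ffmpeg -y -i video.mp4 -i audio.wav -c copy -c:a libmp3lame -b:a 256K video-final.mp4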
Here's a description of the important options for the first FFmpeg command above. Note that their order is sometimes important:
- "-y" is for automatically overwriting the video file if it already exists
- "-f rawvideo" tells FFmpeg to expect raw image data for the frames, instead of something like JPEG or PNG.
- "-pix_fmt rgb24" sets the pixel format. If you record video with transparency preserved, you have to use "-pix_fmt argb" instead.
- "-s 3840x1920" sets the width and height of the frame images (4K in this case)
- "-s 60" sets the framerate.
- "-i tcp://..." makes FFmpeg read the individual frames from a TCP socket. FFmpeg acts as the server, and the VaM plugin connects to it as a client.
- "-vf vflip" mirrors the images vertically. Without it, the video will be upside-down. This is probably due to the pixel order in Unity textures.
- The other options just describe how to encode the output video. You can choose whatever options you want here.
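For reference, here's what a listener command for recording with transparency could look like. I haven't tested this variant; prores_ks with the 4444 profile is just one encoder that keeps the alpha channel ("-c:v libvpx-vp9" with "-pix_fmt yuva420p" would be another option):
Bash:
ffmpeg -y -f rawvideo -pix_fmt argb -s 3840x1920 -r 60 -i tcp://127.0.0.1:54341?listen -vf vflip -c:v prores_ks -profile:v 4444 -pix_fmt yuva444p10le video.mov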