I think I've gotten it working with a scene I'd made a while back, but I'm going to have to verify.

> Has anyone figured out how to get the render plugin to work with reflective surfaces in the scene? When you have a reflective surface (like a mirror, for example) in the scene, it breaks the render plugin and you just get blank frames. I would love to get this resolved somehow so I can record flawless 4K 60fps videos with reflective surfaces. (For example, doing a girl doggy style with a mirror in front of her, so in the video you can see penetration and her face/rack at the same time.) Has anyone found a way?
Please do when you get a chance. If you're able to pull it off, please teach me how!

> I think I've gotten it working with a scene I'd made a while back, but I'm going to have to verify.
I would assume it's something with the render queue, but I don't know how, which object, etc.

> I think I've gotten it working with a scene I'd made a while back, but I'm going to have to verify.
With VaM reflective surfaces, such as a reflective slate floor or glass? If so, how do you get the render plugin to not just render empty frames?

> I've used this to record scenes with reflective surfaces often and never had a problem.
OK, I had to reload assets to test on this new install, but YES: reflections are no problem with EOSIN.

> Please do when you get a chance. If you're able to pull it off, please teach me how!
OK, interesting... I have it attached to the 'windows camera' atom; if the camera is on, the reflection isn't captured, but if I turn the windows camera off, it is. In any case, awesome, THANK YOU!
The best answer is to try it yourself. Technically JPG is lossy, but by the time I convert it to H265 or something else in a size that isn't excessive, I don't feel it makes much of a difference.

> I had in the past and it seemed like the visual quality was much lower. Are you able to get the same visual quality level with .jpg? I may try another render using .jpg because it would save me a lot of time and disk space!
Thanks for the suggestion, I'll give it a shot again and see how it goes.

> The best answer is to try it yourself. Technically JPG is lossy, but by the time I convert it to H265 or something else in a size that isn't excessive, I don't feel it makes much of a difference.
> Here are a couple of JPG frames @ 4K:
> [attachments 456824 and 456823: two 4K JPG sample frames]
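For reference on the "convert it to H265" step mentioned above, here is a minimal sketch of turning a rendered JPG sequence into an H.265 video with FFmpeg. The frame filename pattern and framerate are assumptions, not the plugin's actual output names; adjust them to whatever your render produced.

Bash:
# Sketch: encode a numbered JPG sequence to H.265.
# ASSUMPTION: frames are named frame_000001.jpg, frame_000002.jpg, ... at 60fps.
ffmpeg -framerate 60 -i frame_%06d.jpg -c:v libx265 -preset medium -crf 20 -pix_fmt yuv420p output.mp4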
Sorry, one more question: I'm getting a weird 'flickering' effect with the lights in the scene. I only seem to get it when using transparency. Have you ever seen that?

> The best answer is to try it yourself. Technically JPG is lossy, but by the time I convert it to H265 or something else in a size that isn't excessive, I don't feel it makes much of a difference. […]
I did not realize this! Thank you!

> Just in case you didn't see it and might be interested: with my modified version of the renderer sending frames directly to FFmpeg for video encoding (linked and described earlier in this thread), you can save even more time and especially disk space without any loss in quality. If you use the streaming feature, it sends uncompressed frame data to FFmpeg (the JPEG/PNG setting is ignored). The only quality loss you get is in the final video encoding itself, so it will produce the same results as the original version with PNG, but faster and using less disk space: skipping the JPEG/PNG encoding makes it faster, and it only takes as much disk space as the final video file (as opposed to hundreds of gigabytes), since it doesn't save individual frame images to disk.
> Bash:
> ffmpeg -y -f rawvideo -pix_fmt rgb24 -s **3840x1920** -r 60 -i tcp://127.0.0.1:54341?listen -vf vflip -c:v libx265 -preset medium -pix_fmt yuv420p -crf 20 video.mp4
- "-y" is for automatically overwriting the video file if it already exists
- "-f rawvideo" tells FFmpeg to expect raw image data for the frames, instead of something like JPEG or PNG.
- "-pix_fmt rgb24" sets the pixel format. If you record video with transparency preserved, you have to use "-pix_fmt argb" instead.
- "-s 3840x1920" sets the width and height of the frame images (4K in this case)
- "-s 60" sets the framerate.
- "-i tcp://..." makes FFmpeg read the individual frames from a TCP socket. FFmpeg acts as the server, and the VaM plugin connects to it as a client.
- "-vf vflip" mirrors the images vertically. Without it, the video will be upside-down. This is probably due to the pixel order in Unity textures.
- The other options just describe how to encode the output video. You can choose whatever options you want here.
I've re-downloaded the .var for the plugin and I can't find a 'stream' option in the UI. What am I missing?

> I've just uploaded a first version with the streaming support I talked about. You can get the VAR file from here. It works for me, but note that it's experimental again. It would be cool to see what results other people get. It requires a bit of setup (because it's not possible to run FFmpeg directly from VaM; the user must do that manually), but once you get used to it, it should be very simple.
I tried adding the audio to the rendered video without having to do extra steps after recording, but nothing I tried worked. Maybe someone with more FFmpeg knowledge has an idea, but for now I'm giving up on that.
Here's a (somewhat) quick guide:
First of all, you need a command-line version of FFmpeg installed (I use version 6.1.1; no clue about other versions).
In the plugin's VaM UI, set the "Stream Mode" (very far down on the right) to "Stream". "Host" and "Port" can be left at their defaults. Now before you start the video recording in VaM, you have to run FFmpeg in a console with some specific options. Here's what I use for 4K video without transparency at 60FPS:
Bash:
ffmpeg -y -f rawvideo -pix_fmt rgb24 -s 3840x1920 -r 60 -i tcp://127.0.0.1:54341?listen -vf vflip -c:v libx265 -preset medium -pix_fmt yuv420p -crf 20 video.mp4
Make sure to read the description of the most important options below. The options for the input format must match the ones you use in the plugin, otherwise this will not work.
Once you've started FFmpeg with these special options, you can start the video recording in VaM, which will send the raw frames to FFmpeg live instead of writing images to disk. After recording finishes, FFmpeg should finish too, and you should then have a fully rendered video file.
The video file will not have audio yet. The audio file is still in the same place that it always was for the plugin. You have to merge the video and audio manually. This should be possible in any good video editor like Avidemux or LosslessCut. I use FFmpeg for it as well, with a command like this:
Bash:
ffmpeg -i video.mp4 -i audio.wav -c copy -c:a libmp3lame -b:a 256K video-final.mp4
This assumes that the audio file was renamed to "audio.wav" and put into the video file's directory. A simple script could do both of these FFmpeg commands one after the other automatically, so that it doesn't feel like two steps anymore.
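As a concrete version of that script idea, here is a minimal Bash sketch chaining the guide's two FFmpeg commands. It assumes the same resolution, port, and filenames used above, and that you rename the plugin's audio file to "audio.wav" between the two steps.

Bash:
#!/usr/bin/env bash
# Sketch: run the whole streaming workflow as one script (settings from this guide).
set -e

# Step 1: listen for raw frames from the VaM plugin and encode the video.
ffmpeg -y -f rawvideo -pix_fmt rgb24 -s 3840x1920 -r 60 \
  -i "tcp://127.0.0.1:54341?listen" -vf vflip \
  -c:v libx265 -preset medium -pix_fmt yuv420p -crf 20 video.mp4

# Give the user a chance to put the plugin's audio file in place.
read -rp "Copy the plugin's audio file here as audio.wav, then press Enter..."

# Step 2: mux in the audio, copying the video stream unchanged.
ffmpeg -y -i video.mp4 -i audio.wav -c:v copy -c:a libmp3lame -b:a 256K video-final.mp4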
Here's a description of the important options for the first FFmpeg command above. Note that their order is sometimes important:
- "-y" is for automatically overwriting the video file if it already exists
- "-f rawvideo" tells FFmpeg to expect raw image data for the frames, instead of something like JPEG or PNG.
- "-pix_fmt rgb24" sets the pixel format. If you record video with transparency preserved, you have to use "-pix_fmt argb" instead.
- "-s 3840x1920" sets the width and height of the frame images (4K in this case)
- "-s 60" sets the framerate.
- "-i tcp://..." makes FFmpeg read the individual frames from a TCP socket. FFmpeg acts as the server, and the VaM plugin connects to it as a client.
- "-vf vflip" mirrors the images vertically. Without it, the video will be upside-down. This is probably due to the pixel order in Unity textures.
- The other options just describe how to encode the output video. You can choose whatever options you want here.
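Building on the "-pix_fmt argb" note above, here is a hedged sketch of what a transparency-preserving invocation might look like. The guide itself doesn't give one, so the output side is an assumption: the H.265/yuv420p settings above would discard alpha, so this uses ProRes 4444, which keeps an alpha channel.

Bash:
# ASSUMPTION: untested with the plugin. Raw ARGB frames in, ProRes 4444 out,
# so the alpha channel survives into the output file.
ffmpeg -y -f rawvideo -pix_fmt argb -s 3840x1920 -r 60 \
  -i "tcp://127.0.0.1:54341?listen" -vf vflip \
  -c:v prores_ks -profile:v 4444 -pix_fmt yuva444p10le video.mov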
It should be at the very bottom. I tested it out and so far it seems great!

> I've re-downloaded the .var for the plugin and I can't find a 'stream' option in the UI. What am I missing?
I've tried it, and I don't see anything about streaming. I'll try again, I guess; I don't know why it would be any different. Maybe delete the .var completely and redownload it?

> It should be at the very bottom. I tested it out and so far it seems great!
Only question I have, since I've never used FFmpeg from the command line: what quality level corresponds to an H265 quantizer of 21?
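A hedged note on that question: libx265's CRF scale is designed to track its quantizer scale at roughly the same numbers (constant quality rather than a fixed QP), so "-crf 21" should land close to a quantizer of 21; the mapping is approximate, not exact. For example:

Bash:
# ASSUMPTION: CRF roughly matches the x265 quantizer at similar values, so for
# quality near "quantizer 21", change the guide's -crf 20 to -crf 21.
ffmpeg -y -f rawvideo -pix_fmt rgb24 -s 3840x1920 -r 60 \
  -i "tcp://127.0.0.1:54341?listen" -vf vflip \
  -c:v libx265 -preset medium -pix_fmt yuv420p -crf 21 video.mp4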