This is an amazing piece of code, thank you for this! I've been searching everywhere for solutions to two problems that I'm trying to solve. I'm not yet sure if this will deliver solutions for both issues, but either way, it's a great addition to the options that I have.
I thought maybe this is a good place to ask you guys if you have any creative ideas regarding my use case.
My situation is as follows: I have a Magewell HDMI capture card and a Logitech Brio webcam under Arch Linux which I use for recording Let's Plays (sometimes I also use a second Magewell and another Logitech cam, but let's restrict ourselves to the simpler case). Up until now, I've used two ffmpeg instances to record both V4L2 input devices (plus audio) to disk for later editing. This has worked great so far, with NVENC for fast encoding and smooth 60 FPS for both the cam and the game capture.
The problems began with me getting more into streaming. In principle, OBS is great for that, however, I have two problems:
1. I have tried (almost?) everything to get the Logitech Brio webcam to give me smooth 60 FPS playback in OBS. The cam parameters are (I think) carefully tuned using qv4l2, I always reduce latency via /dev/cpu_dma_latency, set the GPU to max performance, tried the V4L2, ffmpeg and VLC inputs in OBS, ... The video is sometimes butter smooth but then starts to stutter ever so slightly, and this goes back and forth over time. I don't even think this is a problem specific to OBS; rather, it might be an issue of the cam providing framerates (or timestamps) that are slightly off, which becomes a problem when displaying/rendering the video at 60 FPS, since OBS has to synchronize the input with what it renders. I tried adding the source using obs-gstreamer in the most straightforward way:
v4l2src device=/dev/video6 ! image/jpeg,framerate=60/1,width=1280,height=720 ! jpegdec ! video.
(I need to use image/jpeg to set the cam to MJPEG so I get 60 or even 90 FPS, but that has always been the case.) The input works great, but it has exactly the same stuttering issues as, e.g., the normal V4L2 source. Do you have any idea or hint what I could attempt in the gstreamer pipeline to mitigate this issue? Somehow I wish it would just pull frames as fast as it can, and if it has to correct for discrepancies in the framerate, then it should do so as seldom as possible instead of making the video stutter for a prolonged time.
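One thing I've been meaning to try, based on my reading of the GStreamer docs (completely untested, so this is just a sketch), is re-stamping the buffers with the pipeline clock at the source and letting `videorate` enforce a constant 60/1 rate:

```
v4l2src device=/dev/video6 do-timestamp=true ! image/jpeg,framerate=60/1,width=1280,height=720 ! jpegdec ! videorate ! video/x-raw,framerate=60/1 ! video.
```

As far as I understand it, `do-timestamp=true` (a basesrc property) stamps each buffer with the time it arrives instead of whatever the cam reports, and `videorate` duplicates or drops frames to hit the requested framerate. Whether that actually removes the stutter or just trades it for occasional frame duplication, I can't say.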
2. The second, more general, challenge is that I would like to record the different input sources (i.e. the two or more V4L2 devices) separately to disk like I did before for later editing, and at the same time compose them in OBS for streaming. The general problem here is that each of the V4L2 devices can only be opened once. The "go-to" idea is to use the v4l2loopback module to make "copies" of the V4L2 devices and use those in two separate programs. But if you dig a bit deeper in that direction, your sanity may suffer. For example, ffmpeg's V4L2 output only supports the write() method instead of mmap, which drops all timestamps and does not allow for smooth playback. Furthermore, after really trying and digging into the code, I believe that v4l2loopback itself has bugs which lead to race conditions, especially at relatively high framerates. While gst-launch is able to push video to v4l2loopback devices better than ffmpeg does, frames read back from the v4l2loopback devices are sometimes corrupted. Therefore I am looking for other possibilities to split the input devices in some way in order to record them and use them as sources in OBS at the same time.
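For reference, the kind of command I used to feed the loopback devices looked roughly like this (the loopback device numbers are just examples from my setup):

```shell
# Decode the cam's MJPEG stream once, then tee the raw video
# into two v4l2loopback devices (e.g. for OBS and a recorder).
gst-launch-1.0 v4l2src device=/dev/video6 ! \
  image/jpeg,framerate=60/1,width=1280,height=720 ! jpegdec ! tee name=t \
  t. ! queue ! v4l2sink device=/dev/video20 \
  t. ! queue ! v4l2sink device=/dev/video21
```

This is where I saw the occasional corrupted frames on the reading side, so take it as an illustration of the approach rather than a working recipe.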
My vague ideas up to this point include capturing the V4L2 devices and streaming them locally via UDP or TCP; however, latency and reliability were real issues in my experiments, but maybe you have some hints for making this work. Another idea was to use FIFO pseudo-files, and maybe a buffering program in between, to forward video between processes. All in all, I'm looking for ideas on how to share a V4L2 device either between two OBS instances, or between OBS and some external program. Or maybe the recording of the source could even be done as part of the gstreamer pipeline in OBS, but I'm not sure if this is very convenient.
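To make that last idea concrete: I imagine the obs-gstreamer pipeline itself could tee the stream before decoding, muxing the untouched MJPEG frames straight into a file while feeding the decoded video to OBS. Something along these lines (completely untested; the filename is just a placeholder):

```
v4l2src device=/dev/video6 ! image/jpeg,framerate=60/1,width=1280,height=720 ! tee name=t \
  t. ! queue ! jpegdec ! video. \
  t. ! queue ! matroskamux ! filesink location=/tmp/cam.mkv
```

If I understand the docs correctly, matroskamux accepts image/jpeg directly, so this would avoid re-encoding entirely. I just don't know whether obs-gstreamer is happy with a tee and a filesink branch in its pipeline, or how recording start/stop would be controlled.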
Okay, this post has become much longer than I expected, sorry about that. :D But like I said, I have a feeling that you guys might be the right people to ask about these challenges, and maybe obs-gstreamer can help in some way. Thanks again for this plugin!