obs-gstreamer

obs-gstreamer 0.4.1

Tuna

Member
Tuna submitted a new resource:

obs-gstreamer - Feed GStreamer launch pipelines to OBS Studio

An OBS Studio source plugin to feed GStreamer launch pipelines into OBS Studio.

This plugin has interesting use cases but may be difficult to understand and clunky to use if you are not familiar with GStreamer.

No binaries are provided, because too many platforms would potentially have to be supported, and the plugin needs to be compiled against the major version of the GStreamer installation on the target system anyway.

Read more about this resource...
 

Dregu

New Member
Cool! VLC source latency sucks with network streams. Now you can pipe video from a Raspberry Pi camera straight to OBS with just a few ms latency.

Here's the command and pipeline I'm using (Raspberry Pi on Arch Linux ARM -> OBS on Arch Linux):
Code:
/opt/vc/bin/raspivid -n -t 0 -w 1280 -h 720 -g 60 -fps 30 -b 2500000 -mm average -ex fixedfps -ih -pf high -o udp://10.0.0.176:5000

udpsrc port=5000 ! h264parse ! avdec_h264 ! video.
 

Dregu

New Member
Also, video+audio from OBS (Windows) to OBS (Linux) over the internet works great (<1 s delay). Just set "Custom Output (FFmpeg)" to the URL udp://my-ip:5000 (options: mux mpegts, video h264, audio aac) and use this pipeline:

Code:
udpsrc port=5000 ! queue ! decodebin name=bin ! queue ! video. bin. ! queue ! audio.
 

Tuna

Member
Glad you guys like it. There is another example on the GitHub issue page where someone streams over TCP as well. The use cases depend very much on your imagination. Basically this plugin is a bridge to the GStreamer SDK, so you can use it to do all sorts of things. Streaming from other devices is one thing - but you may also try to access hardware devices (or capture screens) directly, which may not be covered by the default plugin set of OBS.
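For instance, an untested sketch of a receiving pipeline for the TCP case - assuming the sender pushes an MPEG-TS-muxed H.264 stream to port 5000 on this machine; the host and port here are just examples - could look like this:

Code:
tcpserversrc host=0.0.0.0 port=5000 ! tsdemux ! h264parse ! avdec_h264 ! video.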
 

Tuna

Member
Quote:
Thanks Tuna.

I have built the version for Windows and included some instructions on how I got it working, in case somebody just wants to test it.

https://github.com/Fabian-Schmidt/obs-gstreamer/releases/tag/v0.0.4-w

Thanks, that's awesome. I haven't gotten around to setting up binary releases. I felt that if I didn't set it up to happen automatically, I would keep running updates all the time - either when I release something new or when new GStreamer versions come out. But hopefully it stays API compatible with future GStreamer releases, and I will slow down on updating stuff too.
 

singleballplay

New Member
Awesome. This is just what I've been looking for, and I'm very excited to play with it tonight. I'm messing around with webcams running on Raspberry Pis using GStreamer's RTSP server, and the Media Source plugin just isn't cutting it for latency, so I'm hoping this does the trick.
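In case it helps anyone trying the same thing, a hypothetical receiving pipeline for such an RTSP feed - the URL, port and the assumption that the Pi serves H.264 are mine, adjust to your setup - might look like this:

Code:
rtspsrc location=rtsp://raspberrypi.local:8554/stream latency=0 ! rtph264depay ! h264parse ! avdec_h264 ! video.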
 

Ochi

New Member
This is an astonishing piece of code, thank you for this! I've been searching everywhere for solutions to two problems that I'm trying to solve. I'm not yet sure if this will deliver solutions for both issues, but either way, it's a great addition to the options that I have.

I thought maybe this is a good place to ask you guys if you have any creative ideas regarding my use case.

My situation is as follows: I have a Magewell HDMI capture card and a Logitech Brio webcam under Arch Linux which I use for recording Let's Plays (sometimes I also use a second Magewell and another Logitech cam, but let's restrict ourselves to the simpler case). Up until now, I've used two ffmpeg instances to record both V4L2 input devices (plus audio) to disk for later editing. This has worked great so far, with NVENC for fast encoding and smooth 60 FPS for both the cam and the game capture.

The problems began with me getting more into streaming. In principle, OBS is great for that, however, I have two problems:

1. I tried (almost?) everything to get the Logitech Brio webcam to give me smooth 60 FPS playback in OBS. The cam parameters are (I think) carefully tuned using qv4l2, I always reduce latency using /dev/cpu_dma_latency, I set the GPU to max performance, I tried the V4L2, ffmpeg and VLC inputs in OBS, ... The video is sometimes butter smooth but then starts to stutter ever so slightly, and this goes back and forth over time. I don't even think this is a problem specific to OBS; rather, it might be an issue of the cam providing framerates (or timestamps) that are slightly off, which becomes a problem when displaying/rendering the video at 60 FPS, since OBS has to synchronize the input with what it renders. I tried adding the source using obs-gstreamer in the most straightforward way:

Code:
v4l2src device=/dev/video6 ! image/jpeg,framerate=60/1,width=1280,height=720 ! jpegdec ! video.

(I need to use image/jpeg to set the cam to MJPEG so I get 60 or even 90 FPS, but that has always been the case.) The input works great, but it has exactly the same stuttering issues as e.g. the normal V4L2 source. Do you have any idea or hint what I could attempt in the GStreamer pipeline to mitigate this issue? Somehow I wish that it would just pull frames as fast as it can, and if it has to correct for discrepancies in the framerate, do so as seldom as possible instead of making the video stutter for a prolonged time.

2. The second, more general, challenge is that I would like to record the different input sources (i.e. the two or more V4L2 devices) separately to disk like I did before, for later editing, and at the same time compose them in OBS for streaming. The general problem with this is that each of the V4L2 devices can only be opened once. The "go-to" idea is to use the v4l2loopback module to make "copies" of the V4L2 devices and use those in two separate programs. But if you dig a bit deeper in that direction, your sanity may suffer. For example, ffmpeg's V4L2 output only supports the write() method instead of mmap, which drops all timestamps and does not allow for smooth playback. Furthermore, after really trying and digging into the code, I believe that v4l2loopback itself has bugs which lead to race conditions, especially at relatively high framerates. While gst-launch is able to push video to v4l2loopback devices better than ffmpeg does, frames read from the v4l2loopback devices are sometimes corrupted. Therefore I am looking for other possibilities to split the input devices in some way in order to record them and use them as sources in OBS at the same time.

My vague ideas up to this point include capturing the V4L2 devices and streaming them locally via UDP or TCP; however, latency and reliability really were an issue in my experiments, but maybe you have some hints for making this work. Another idea was to use FIFO pseudo-files, and maybe a buffering program in between, to forward video between processes. All in all, I'm looking for ideas on how to share a V4L2 device either between two OBS instances, or between OBS and some external program. Or maybe the recording of the source could even be done as part of the GStreamer pipeline in OBS, but I'm not sure how convenient that would be.

Okay, this post has become much longer than I expected, sorry about that. :D But like I said, I have a feeling that you guys may be the right people to ask about these challenges, and maybe obs-gstreamer can help in some way. Thanks again for this plugin!
 

Tuna

Member
1. I would at least try to add some queue elements to the pipeline so you get some buffering in case downstream is blocking. It may not help with the stuttering though. You can also enable pipeline timestamps in the options; the samples will then get synced to the system clock, potentially helping playback smoothness. It may add latency though - in particular, you may encounter sync issues after recording for longer periods of time. But ultimately it depends on how smoothly and consistently V4L2 and the camera deliver frames. It can be improved further, but that requires all components to be synced to the same clock - which is not trivial to achieve.

Code:
v4l2src device=/dev/video6 ! image/jpeg,framerate=60/1,width=1280,height=720 ! queue ! jpegdec ! video.

2. You can share the memory/image data the camera delivers. Use tee elements and the shm elements (note that shmsink/shmsrc need a matching socket-path, and filesink needs a location - the paths below are just examples). For example, make an external app (or use gst-launch-1.0) that runs a pipeline similar to this:

Code:
v4l2src ! image/jpeg ! tee name=tee \
  tee. ! queue ! jpegdec ! gdppay ! shmsink socket-path=/tmp/cam \
  tee. ! queue ! matroskamux ! filesink location=recording.mkv

Then you can pick up the image from the shm socket with this plugin from OBS. (There is another Linux plugin here that captures GNOME screens which basically uses the same idea, if you need a reference.)

Code:
shmsrc socket-path=/tmp/cam ! gdpdepay ! video.

Or of course you can also save it to disk directly in the plugin pipeline:

Code:
v4l2src ! image/jpeg ! tee name=tee \
  tee. ! queue ! jpegdec ! video. \
  tee. ! queue ! matroskamux ! filesink location=recording.mkv
 

Ochi

New Member
Thank you very much for your response! I wasn't aware of the shmsrc/shmsink and the gdppay/gdpdepay elements; I guess it doesn't get much closer than that to what I'm trying to achieve. I was able to show the sources tiled next to each other in one OBS instance for recording, and at the same time forward them to another OBS instance for composing the sources for streaming. It seems to be quite reliable, also when I change something in the master instance and the slave needs to reconnect. I had to disable "Sync appsinks to clock" in the slave's settings though, otherwise both ends become incredibly laggy.

(Also the two OBS instances with one recording in 4K 60 FPS on the GPU and one encoding for streaming on the CPU are borderline intense on system resources when much is happening on screen, but that's another issue. ;) )

What I also did was compile the GStreamer "entrans" family of plugins in an attempt to fix the stuttering using the "stamp" plugin, but not really successfully yet. And I (re)compiled gst-plugins-bad to include NVENC capabilities, since I'm playing around with the possibility of recording outside of OBS on the command line and forwarding the video to the streaming OBS instance.
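For reference, the kind of command-line NVENC recording pipeline I'm experimenting with looks roughly like this (untested sketch; the device, caps and filename are just examples):

Code:
gst-launch-1.0 v4l2src device=/dev/video6 ! image/jpeg,framerate=60/1,width=1280,height=720 ! jpegdec ! videoconvert ! nvh264enc ! h264parse ! matroskamux ! filesink location=cam.mkv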

I saw that there are also other "inter-process" sinks/sources like "ipcpipeline"; however, I'm not sure how to connect the fdin/fdout properties in a pipeline on the command line... just out of curiosity, is there a way to do this? I really like the possibilities of GStreamer, but I'm just getting started understanding them. :)

Thank you for your help, much appreciated!
 

Tuna

Member
GStreamer can do a lot. Feel free to venture down the adventurous path. But all this plugin does is offer you a way to feed some audio and video data into OBS ;-)

The more complex your setup gets, the more you may also find it helpful to modify the plugin itself - it really does the absolute minimum for what it does.
 

Denton

New Member
Hi, I'd very much like to feed GStreamer into Windows OBS from a Raspberry Pi. My initial try with the kindly compiled 0.0.4 binaries resulted in a BSOD.
I thought I'd have a go at compiling it myself - and saw that it needs the latest GStreamer. After some uphill work, I got GStreamer 1.14.4 compiled and installed using Visual Studio 2017 and MinGW.

However, I'm missing something since when I try to build obs-gstreamer - I get:

Code:
$ meson --buildtype=release build
The Meson build system
Version: 0.48.1
Source dir: C:\MinGW\msys\1.0\home\admin\obs-gstreamer-master
Build dir: C:\MinGW\msys\1.0\home\admin\obs-gstreamer-master\build
Build type: native build
Project name: obs-gstreamer-source
Project version: undefined
Native C compiler: gcc (gcc 6.3.0 "gcc (MinGW.org GCC-6.3.0-1) 6.3.0")
Build machine cpu family: x86
Build machine cpu: x86
Compiler for C supports link arguments -static-libgcc: YES

meson.build:30:0: ERROR: C library 'obs' not found

A full log can be found at C:\MinGW\msys\1.0\home\admin\obs-gstreamer-master\build\meson-logs\meson-log.txt

Do you have any instructions how to compile for Windows please?

Regards
 

Tuna

Member
First of all, you could have just installed the binary release of GStreamer from their site. That gets a lot of the hassle of compiling GStreamer yourself out of the way ;-)

You are currently stuck at finding the required OBS development files. I rely on Meson's pkg-config search to do that. That may be a little trickier on the Windows side, because Windows does not really support this system.

To be honest, I think your best bet is to create a Visual Studio project or Makefile by hand and add the locations of libraries, header files etc. manually. Sorry if that's a little inconvenient.. but Windows is just too annoying in that regard - I'm open to suggestions.
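If you want to stick with Meson, something along these lines might also work from the MSYS shell (untested sketch; the OBS source and build paths are assumptions, adjust them to your checkout):

Code:
# Point the compiler at the libobs headers and import library before configuring.
export CFLAGS="-I/c/obs-studio/libobs"
export LDFLAGS="-L/c/obs-studio/build/libobs"
meson --buildtype=release build
ninja -C build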

P.S. Since this is all in userspace, a BSOD should never happen..
 

Tuna

Member
Tuna updated obs-gstreamer with a new update entry:

0.0.6

- Fixed some small bugs I think
- Accept RGBx and BGRx color formats
- Experimental prebuilt plugins

The prebuilt libraries are untested. If they don't work for you, please try to figure out why and report back. I suspect there may be problems with dependency libraries being found on the target system, as they may be located in different places than on the build machine..

Read the rest of this update entry...
 

Burgs

New Member
Greetings,

I am brand new to OBS, and this plugin got my attention. I would like to ask how to use it to feed a GStreamer pipeline into OBS. I assume the pipeline is used as a source, but I'm not sure how to go about configuring OBS to set it up... something simple such as "gst-launch-1.0.exe videotestsrc ! autovideosink" I suppose is a good place to start. I thank you in advance for any advice. Regards.
 

Tuna

Member
If you check out the README.md, there is an example for exactly that. In fact, it is the default pipeline when you add the source to a scene.

Simplified and adjusted to your video-only example, it would be something like this:

Code:
videotestsrc is-live=true ! video.
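And if you also want test audio fed in alongside the video, an untested variant would be:

Code:
videotestsrc is-live=true ! video. audiotestsrc is-live=true ! audio.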
 