Message-ID: <5069963.GXAFRqVoOG@natalenko.name>
Date: Tue, 03 Feb 2026 15:38:06 +0100
From: Oleksandr Natalenko <oleksandr@...alenko.name>
To: Laurent Pinchart <laurent.pinchart@...asonboard.com>
Cc: Gergo Koteles <soyer@....hu>,
Mauro Carvalho Chehab <mchehab+huawei@...nel.org>,
Jarkko Sakkinen <jarkko@...nel.org>, linux-media@...r.kernel.org,
jani.nikula@...ux.intel.com, anisse@...ier.eu,
Mauro Carvalho Chehab <mchehab@...nel.org>,
Hans Verkuil <hverkuil@...nel.org>,
Sakari Ailus <sakari.ailus@...ux.intel.com>,
Jacopo Mondi <jacopo.mondi@...asonboard.com>,
Ricardo Ribalda <ribalda@...omium.org>,
open list <linux-kernel@...r.kernel.org>,
Nicolas Dufresne <nicolas@...fresne.ca>
Subject: Re: [RFC PATCH] media: Virtual camera driver
On Tuesday, 3 February 2026 at 2:23:13 Central European Standard Time, Laurent Pinchart wrote:
> Hi Oleksandr,
>
> (Cc'ing Nicolas Dufresne)
>
> On Mon, Feb 02, 2026 at 12:45:15PM +0100, Oleksandr Natalenko wrote:
> > On Monday, 2 February 2026 at 12:40:12 Central European Standard Time, Laurent Pinchart wrote:
> > > > If I understand correctly, it would be more forward-thinking to develop
> > > > virtual camera support in PipeWire rather than in the kernel.
> > >
> > > I don't think there's even a need for development in PipeWire:
> > >
> > > $ gst-launch-1.0 \
> > > videotestsrc ! \
> > > video/x-raw,format=YUY2 ! \
> > > pipewiresink mode=provide stream-properties="properties,media.class=Video/Source,media.role=Camera"
> > >
> > > This gives me a virtual camera in Firefox. Extending the GStreamer
> > > pipeline to get the video stream from the network should be quite
> > > trivial.
> >
> > So far, I've come up with this:
> >
> > * sender:
> >
> > $ gst-launch-1.0 pipewiresrc path=<webcam_id> ! image/jpeg, width=1280, height=720, framerate=24/1 ! rndbuffersize max=1400 ! udpsink host=<receiver_host> port=<receiver_port>
> >
> > * receiver:
> >
> > $ gst-launch-1.0 udpsrc address=<receiver_host> port=<receiver_port> ! queue ! image/jpeg, width=1280, height=720, framerate=24/1 ! jpegparse ! jpegdec ! pipewiresink mode=provide stream-properties="properties,media.class=Video/Source,media.role=Camera" client-name=VirtualCam
> >
> > Please let me know if I'm doing something dumb here. Trial and error to
> > make this work took me a couple of hours, but it seems to provide
> > what I need.
>
> There's nothing dumb at all, especially given that it works :-) I have
> been able to reproduce it locally (using a different pipeline on the
> sender side).
>
> I compared your pipelines with another JPEG-over-UDP setup I used a
> while ago, which used an rtpjpegpay element before udpsink on the sender
> side to encapsulate the payload in RTP packets, and an rtpjpegdepay
> element on the receiver side after udpsrc. This helps the receiver
> synchronize with the sender if the sender is started first. The full
> pipelines are
>
> * Sender:
>
> gst-launch-1.0 \
> v4l2src ! \
> video/x-raw,pixelformat=YUYV,size=640x480 ! \
> jpegenc ! \
> rtpjpegpay ! \
> udpsink host=192.168.10.200 port=8000
>
> * Receiver:
>
> gst-launch-1.0 \
> udpsrc port=8000 ! \
> application/x-rtp,encoding-name=JPEG,payload=26 ! \
> rtpjpegdepay ! \
> jpegdec ! \
> video/x-raw,pixelformat=YUYV,size=640x480 ! \
> queue ! \
> pipewiresink mode=provide \
> stream-properties="properties,media.class=Video/Source,media.role=Camera" \
> client-name="Remote Camera"
>
> Unfortunately this doesn't work; when the PipeWire client connects to
> the stream on the receiver side I get
>
> ERROR: from element /GstPipeline:pipeline0/GstPipeWireSink:pipewiresink0: stream error: no more input formats
>
> Nicolas, would you have any wisdom to share about this and tell me if I
> did something dumb? :-) There's no hurry.

Just to share my current state of affairs:

* sender:

$ gst-launch-1.0 pipewiresrc path=<webcam_id> ! video/x-h264, width=1280, height=720, framerate=24/1 ! rtph264pay ! rtpstreampay ! udpsink host=<receiver_host> port=<receiver_port>

* receiver:

$ gst-launch-1.0 udpsrc address=<receiver_host> port=<receiver_port> ! queue ! application/x-rtp-stream,encoding-name=H264 ! rtpstreamdepay ! application/x-rtp,encoding-name=H264 ! rtph264depay ! h264parse ! openh264dec ! pipewiresink mode=provide stream-properties="properties,media.class=Video/Source,media.role=Camera" client-name=VirtualCam

I chose H.264 because of its much lower (roughly tenfold) traffic compared to MJPEG, wrapped it into RTP, and opted for the OpenH264 decoder because I read it handles low-latency streams better than avdec_h264. I tested this setup with both Firefox and Chromium, and it worked pretty reliably, so I'm genuinely impressed.
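
(Side note: whether openh264dec or avdec_h264 is actually present depends on the installed plugin sets. A tiny Python sketch like the one below, which only assumes the two element names already mentioned, can be used to check before picking one.)

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

# Report which of the two H.264 decoders mentioned above is installed,
# so the receiver pipeline can be built with one that is present.
Gst.init(None)
for name in ("openh264dec", "avdec_h264"):
    print(f"{name}: {'available' if Gst.ElementFactory.find(name) else 'missing'}")
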
The only issue I have with this setup is that once the browser tab with the meeting is closed, the whole receiver pipeline stops gracefully because "PipeWire link to remote node was destroyed". I didn't find a way to tell the pipeline to simply restart, so I ended up wrapping it in a Python script using Gst.parse_launch() and friends, with error-message parsing that restarts the pipeline from within the script.
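
For reference, the wrapper is roughly along these lines. This is only a minimal sketch: the pipeline string is the receiver pipeline from above with its placeholders left in, and the one-second back-off before restarting is an arbitrary choice of mine, nothing GStreamer or PipeWire mandates:

#!/usr/bin/env python3
# Minimal restart wrapper around the receiver pipeline described above.
# The <receiver_host>/<receiver_port> placeholders must be filled in,
# and the 1-second back-off is an arbitrary value.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

PIPELINE = (
    'udpsrc address=<receiver_host> port=<receiver_port> ! queue ! '
    'application/x-rtp-stream,encoding-name=H264 ! rtpstreamdepay ! '
    'application/x-rtp,encoding-name=H264 ! rtph264depay ! h264parse ! '
    'openh264dec ! pipewiresink mode=provide '
    'stream-properties="properties,media.class=Video/Source,media.role=Camera" '
    'client-name=VirtualCam'
)

pipeline = None
restart_pending = False

def start():
    # (Re)build the pipeline from scratch and start playing.
    global pipeline, restart_pending
    restart_pending = False
    pipeline = Gst.parse_launch(PIPELINE)
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message::error", on_bus_message)
    bus.connect("message::eos", on_bus_message)
    pipeline.set_state(Gst.State.PLAYING)
    return False  # run only once when scheduled via timeout_add_seconds()

def restart():
    # Tear the old pipeline down and schedule a fresh one shortly after,
    # e.g. when the browser drops the PipeWire link and the sink errors out.
    global pipeline, restart_pending
    if restart_pending:
        return
    restart_pending = True
    if pipeline is not None:
        pipeline.get_bus().remove_signal_watch()
        pipeline.set_state(Gst.State.NULL)
        pipeline = None
    GLib.timeout_add_seconds(1, start)

def on_bus_message(bus, msg):
    if msg.type == Gst.MessageType.ERROR:
        err, debug = msg.parse_error()
        print(f"pipeline error: {err.message}; restarting")
    else:
        print("pipeline reached EOS; restarting")
    restart()

start()
GLib.MainLoop().run()

Rebuilding the pipeline from scratch via Gst.parse_launch() keeps the sketch simple, and the same handler also covers plain EOS, so the receiver keeps coming back regardless of how the remote side went away.
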
I'm leaving this here in public because it's a straightforward and potentially widely used setup, yet there's little to no information on how to do it properly, and what does exist is scattered across random posts of varying age.
--
Oleksandr Natalenko, MSE