Message-ID: <20260203012313.GA280953@killaraus>
Date: Tue, 3 Feb 2026 03:23:13 +0200
From: Laurent Pinchart <laurent.pinchart@...asonboard.com>
To: Oleksandr Natalenko <oleksandr@...alenko.name>
Cc: Gergo Koteles <soyer@....hu>,
Mauro Carvalho Chehab <mchehab+huawei@...nel.org>,
Jarkko Sakkinen <jarkko@...nel.org>, linux-media@...r.kernel.org,
jani.nikula@...ux.intel.com, anisse@...ier.eu,
Mauro Carvalho Chehab <mchehab@...nel.org>,
Hans Verkuil <hverkuil@...nel.org>,
Sakari Ailus <sakari.ailus@...ux.intel.com>,
Jacopo Mondi <jacopo.mondi@...asonboard.com>,
Ricardo Ribalda <ribalda@...omium.org>,
open list <linux-kernel@...r.kernel.org>,
Nicolas Dufresne <nicolas@...fresne.ca>
Subject: Re: [RFC PATCH] media: Virtual camera driver
Hi Oleksandr,
(Cc'ing Nicolas Dufresne)
On Mon, Feb 02, 2026 at 12:45:15PM +0100, Oleksandr Natalenko wrote:
> On Monday, 2 February 2026 at 12:40:12 Central European Standard Time, Laurent Pinchart wrote:
> > > If I understand correctly, it would be more forward-thinking to develop
> > > virtual camera support in PipeWire rather than in the kernel.
> >
> > I don't think there's even a need for development in PipeWire:
> >
> > $ gst-launch-1.0 \
> > videotestsrc ! \
> > video/x-raw,format=YUY2 ! \
> > pipewiresink mode=provide stream-properties="properties,media.class=Video/Source,media.role=Camera"
> >
> > This gives me a virtual camera in Firefox. Extending the GStreamer
> > pipeline to get the video stream from the network should be quite
> > trivial.
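[Editor's note: a quick way to confirm that such a stream is actually registered as a camera node (a hedged check, assuming a WirePlumber-based PipeWire session with the `wpctl` and `pw-cli` tools installed) is to inspect the session graph while the gst-launch pipeline is running:]

```shell
# List all PipeWire objects; the stream started above should appear in
# the "Video" section as a Video/Source node (assumes WirePlumber).
wpctl status

# Alternatively, dump nodes with pw-cli and look for the Video/Source
# media class (grep pattern kept loose, as the exact property layout
# of the dump may vary between PipeWire versions).
pw-cli list-objects Node | grep -i 'video/source'
```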
>
> So far, I came up with this:
>
> * sender:
>
> $ gst-launch-1.0 pipewiresrc path=<webcam_id> ! image/jpeg, width=1280, height=720, framerate=24/1 ! rndbuffersize max=1400 ! udpsink host=<receiver_host> port=<receiver_port>
>
> * receiver:
>
> $ gst-launch-1.0 udpsrc address=<receiver_host> port=<receiver_port> ! queue ! image/jpeg, width=1280, height=720, framerate=24/1 ! jpegparse ! jpegdec ! pipewiresink mode=provide stream-properties="properties,media.class=Video/Source,media.role=Camera" client-name=VirtualCam
>
> Please let me know if I do something dumb here. Trial and error to
> make this work took a couple of hours for me, but it seems to provide
> what I need.
There's nothing dumb at all, especially given that it works :-) I have
been able to reproduce it locally (using a different pipeline on the
sender side).
I compared your pipelines with another JPEG-over-UDP setup I used a
while ago, which used an rtpjpegpay element before udpsink on the sender
side to encapsulate the payload in RTP packets, and an rtpjpegdepay
element on the receiver side after udpsrc. This helps the receiver
synchronize with the sender if the sender is started first. The full
pipelines are:
* Sender:
gst-launch-1.0 \
v4l2src ! \
video/x-raw,format=YUY2,width=640,height=480 ! \
jpegenc ! \
rtpjpegpay ! \
udpsink host=192.168.10.200 port=8000
* Receiver:
gst-launch-1.0 \
udpsrc port=8000 ! \
application/x-rtp,encoding-name=JPEG,payload=26 ! \
rtpjpegdepay ! \
jpegdec ! \
video/x-raw,format=YUY2,width=640,height=480 ! \
queue ! \
pipewiresink mode=provide \
stream-properties="properties,media.class=Video/Source,media.role=Camera" \
client-name="Remote Camera"
Unfortunately this doesn't work; when the PipeWire client connects to
the stream on the receiver side I get
ERROR: from element /GstPipeline:pipeline0/GstPipeWireSink:pipewiresink0: stream error: no more input formats
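[Editor's note: one hedged way to narrow this down (not from the thread; the element choices are an assumption) is to take pipewiresink out of the equation and decode the RTP stream to a local video sink first, with a videoconvert inserted so the decoder's output format cannot be the limiting factor:]

```shell
# Debugging sketch (assumed port 8000, matching the pipelines above):
# if this shows video, the RTP/JPEG leg is fine and the negotiation
# problem lies between jpegdec's output caps and pipewiresink.
gst-launch-1.0 \
    udpsrc port=8000 ! \
    application/x-rtp,media=video,encoding-name=JPEG,payload=26,clock-rate=90000 ! \
    rtpjpegdepay ! \
    jpegdec ! \
    videoconvert ! \
    autovideosink
```

Running the original receiver pipeline with `GST_DEBUG=3` set in the environment should also log which caps pipewiresink refuses during negotiation.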
Nicolas, would you have any wisdom to share about this and tell me if I
did something dumb? :-) There's no hurry.
--
Regards,
Laurent Pinchart