I want to stream my computer's screen to other machines using GStreamer and generate an RTSP address that I can open in OpenCV.
I tried some examples for streaming a webcam between computers, and that works correctly:
Command on the remote computer
gst-launch-1.0 -v v4l2src device=/dev/video1 ! "image/jpeg,width=1280, height=720,framerate=30/1" ! rtpjpegpay ! udpsink host=192.168.1.17 port=5001
Command on the local computer
gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
I found an example that captures the current desktop, but now I don't know how to create an RTSP address from it, and when I simply try to stream the current desktop I get nothing in the other terminal:
gst-launch-1.0 -v ximagesrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtpmp4vpay ! udpsink host=192.168.1.17 port=5000
NOTHING APPEARS IN THAT TERMINAL
gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
Here is the output I get in the first terminal:
gst-launch-1.0 -v ximagesrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstXImageSrc:ximagesrc0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, format=(string)Y444, pixel-aspect-ratio=(fraction)1/1
Redistribute latency...
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, format=(string)Y444, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, codec_data=(buffer)01f40028ffe1001e67f40028919680780227e27016a02020280000030008000003014478c19501000468ef3192, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)high-4:4:4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)f40028, sprop-parameter-sets=(string)"Z/QAKJGWgHgCJ+JwFqAgICgAAAMACAAAAwFEeMGV\,aO8xkg\=\=", payload=(int)96, ssrc=(uint)1850617788, timestamp-offset=(uint)3214623554, seqnum-offset=(uint)16401, a-framerate=(string)20
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)f40028, sprop-parameter-sets=(string)"Z/QAKJGWgHgCJ+JwFqAgICgAAAMACAAAAwFEeMGV\,aO8xkg\=\=", payload=(int)96, ssrc=(uint)1850617788, timestamp-offset=(uint)3214623554, seqnum-offset=(uint)16401, a-framerate=(string)20
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, codec_data=(buffer)01f40028ffe1001e67f40028919680780227e27016a02020280000030008000003014478c19501000468ef3192, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)high-4:4:4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 3214623853
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 16401
And in the receiving terminal:
gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Answer 1
You are using rtpmp4vpay on the encoding side but rtph264depay on the receiving side. Try rtph264pay instead to packetize the stream into RTP packets. I am even surprised it works at all with x264enc, which generates an H.264 stream...
Also try removing decodebin, which should not be needed here.
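Following that advice, a minimal corrected pair of commands could look like the sketch below. It keeps the question's screen-capture pipeline and only swaps rtpmp4vpay for rtph264pay; as assumptions beyond the answer itself, it also adds config-interval=1 so SPS/PPS are re-sent periodically (so a receiver started late can still decode) and uses an explicit avdec_h264 decoder in place of decodebin. The host 192.168.1.17 is taken from the webcam example above.
Command on the sharing computer
gst-launch-1.0 -v ximagesrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.17 port=5000
Command on the receiving computer
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink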