How to stream the screen to a remote computer using gstreamer

I want to stream my computer screen to another computer with gstreamer and generate an rtsp address that I can use in OpenCV.
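
For reference, a udpsink pipeline on its own does not produce an rtsp:// address; that takes an RTSP server such as gst-rtsp-server. A rough sketch, assuming the test-launch example binary that ships with gst-rtsp-server is built, would serve the desktop at rtsp://<sender-ip>:8554/test, which OpenCV can then open with cv2.VideoCapture:

./test-launch --port 8554 "( ximagesrc use-damage=0 ! video/x-raw,framerate=20/1 ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay name=pay0 pt=96 )"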

I tried a few examples of streaming a webcam between computers, and they work fine.

Command on the remote computer
gst-launch-1.0 -v v4l2src device=/dev/video1 ! "image/jpeg,width=1280, height=720,framerate=30/1" ! rtpjpegpay ! udpsink host=192.168.1.17 port=5001

Command on the local computer
gst-launch-1.0 -e -v udpsrc port=5001 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink

I have seen a few examples that display the current desktop, but for now I do not know how to generate an rtsp address, and when I try to stream the current desktop I get nothing in the other terminal.

gst-launch-1.0 -v ximagesrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtpmp4vpay  ! udpsink host=192.168.1.17 port=5000


NOTHING APPEARS IN THAT TERMINAL
gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink

Here is what I got in the first terminal:

gst-launch-1.0 -v ximagesrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay  ! udpsink host=127.0.0.1 port=5000
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstXImageSrc:ximagesrc0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, format=(string)Y444, pixel-aspect-ratio=(fraction)1/1
Redistribute latency...
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, format=(string)Y444, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)BGRx, width=(int)1920, height=(int)1080, framerate=(fraction)20/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, codec_data=(buffer)01f40028ffe1001e67f40028919680780227e27016a02020280000030008000003014478c19501000468ef3192, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)high-4:4:4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)f40028, sprop-parameter-sets=(string)"Z/QAKJGWgHgCJ+JwFqAgICgAAAMACAAAAwFEeMGV\,aO8xkg\=\=", payload=(int)96, ssrc=(uint)1850617788, timestamp-offset=(uint)3214623554, seqnum-offset=(uint)16401, a-framerate=(string)20
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)f40028, sprop-parameter-sets=(string)"Z/QAKJGWgHgCJ+JwFqAgICgAAAMACAAAAwFEeMGV\,aO8xkg\=\=", payload=(int)96, ssrc=(uint)1850617788, timestamp-offset=(uint)3214623554, seqnum-offset=(uint)16401, a-framerate=(string)20
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, codec_data=(buffer)01f40028ffe1001e67f40028919680780227e27016a02020280000030008000003014478c19501000468ef3192, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)high-4:4:4, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 3214623853
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 16401

And in the receiver terminal:

gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
Setting pipeline to PLAYING ...
New clock: GstSystemClock

Answer 1

You are using rtpmp4vpay on the encoding side but rtph264depay on the receiver. Try rtph264pay for packing into RTP packets; I am surprised it works at all with x264enc, which outputs an h264 stream.

Try removing decodebin, which is not useful here.
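
Following that advice, a corrected pair of commands might look like the sketch below. A few assumptions on my part: the receiver is at 192.168.1.17, port 5000 is used on both sides, and avdec_h264 (from the GStreamer libav plugins) stands in for decodebin as an explicit decoder; config-interval=1 makes rtph264pay resend SPS/PPS periodically so a receiver that starts late can still decode.

Sender (screen capture machine)
gst-launch-1.0 -v ximagesrc use-damage=0 ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.17 port=5000

Receiver
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink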
