GStreamer issues
https://gitlab.freedesktop.org/groups/gstreamer/-/issues

## qtmux/qtdemux: Not able to stitch back the fragmented input MP4 file using splitmuxsrc
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/941 (Shiva Kumar, 2023-07-04)

Hi,
When we use "splitmuxsink" with "qtmux" to split an input file into fragments and then try to stitch them back into the original MP4 file, "qtmux" reports a "Buffer has no PTS." error.
Command 1:
gst-launch-1.0 filesrc location=bbb_sunflower_1080p_60fps_normal.mp4 ! qtdemux ! splitmuxsink muxer=qtmux max-size-time=80000000000 location=tmpfile%02d.mp4
Command 2:
gst-launch-1.0 splitmuxsrc location=tmpfile0* ! h264parse ! qtmux ! filesink location=output.mp4
Command 2, which uses "splitmuxsrc", fails with the "Buffer has no PTS." error.
On doing some basic analysis, I found that the "h264parse" in front of "qtmux" receives duplicate PTS values. As a result, "h264parse" produces a buffer whose PTS is "none", which causes "qtmux" to report the "Buffer has no PTS." error.
Analysis:
gst-launch-1.0 splitmuxsrc location=tmpfile0* ! identity silent=false ! h264parse ! qtmux ! filesink location=del.mp4 -v
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (1919 bytes, dts: 0:01:19.783333333, pts: 0:01:19.800000000, duration: 0:00:00.016666667, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7f5cfc02e000
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (8674 bytes, dts: 0:01:19.800000000, pts: 0:01:19.883333333, duration: 0:00:00.016666667, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7f5cfc02c360
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (1935 bytes, dts: 0:01:19.816666667, pts: 0:01:19.850000000, duration: 0:00:00.016666666, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7f5cfc02e5a0
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (1717 bytes, dts: 0:01:19.833333333, pts: 0:01:19.833333333, duration: 0:00:00.016666667, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7f5cfc02c000
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (1492 bytes, dts: 0:01:19.850000000, pts: **0:01:19.866666667**, duration: 0:00:00.016666667, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7f5cfc0297e0
(Sample file: [bbb_sunflower_1080p_60fps_normal](/uploads/c7060c5b4c86fb47c31e2d8165aa1a5e/bbb_sunflower_1080p_60fps_normal.mp4))
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = event ******* (identity0:sink) E (type: tag (20510), GstTagList-stream, taglist=(taglist)"taglist\,\ video-codec\=\(string\)\"H.264\\\ /\\\ AVC\"\,\ maximum-bitrate\=\(uint\)19727256\,\ bitrate\=\(uint\)4001448\;";) 0x7f5cec002b20
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (289041 bytes, dts: 0:01:19.833333334, pts: **0:01:19.866666667**, duration: 0:00:00.016666666, offset: -1, offset_end: -1, flags: 00000000 , meta: none) 0x7f5cf0006c60
/GstPipeline:pipeline0/GstIdentity:identity0: last-message = chain ******* (identity0:sink) (5777 bytes, dts: 0:01:19.850000000, pts: 0:01:19.933333334, duration: 0:00:00.016666667, offset: -1, offset_end: -1, flags: 00002000 delta-unit , meta: none) 0x7f5cf00067e0
ERROR: from element /GstPipeline:pipeline0/GstQTMux:qtmux0: Could not multiplex stream.
Additional debug info:
gstqtmux.c(4832): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstQTMux:qtmux0:
Buffer has no PTS.
Execution ended after 0:00:00.240679964
If I use "matroskamux" in place of "qtmux" in "Command 1", then everything works fine.

## matroska-mux: Add GstVideoCodecAlpha support
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/944 (Alexandr Topilski, 2023-07-04)

WebM alpha mode is now supported for demuxing and decoding; the next logical step would be to support encoding and muxing. For this to happen, we'll need an encoder wrapper similar to what is used for decoders. Inside the bin we'll need an alphasplit element to separate the color and alpha planes. This implementation will likely need to behave differently depending on the encoder used. Then two (unmodified) encoder instances can be used, and a codecalphamux element will put the alpha channel into the color GstBuffer using GstVideoCodecAlphaMeta (and of course update the caps).
Optionally, encoders whose API supports alpha internally can simply expose such support without the wrapper.
With this in place, the Matroska muxer will need to add the alpha mode tag and attach the auxiliary alpha data.

## Application Development Manual: add gst-launch crash course before diving into code
https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/2 (Bugzilla Migration User, 2023-07-04)

## Submitted by Tim Müller `@tpm`
**[Link to original bug (#707586)](https://bugzilla.gnome.org/show_bug.cgi?id=707586)**
## Description
Ian Davidson suggested this:
On the GStreamer web site it says “Application Development Manual (Read this first)”, so that would seem to be the place to start if you want to learn about GStreamer.
Very early in the document (section 2.1), it says that “The programmer can use an extensive set of powerful tools to create media pipelines without writing a single line of code. ” That is good to know and is brought about by the library of 'Plug-ins'.
But – then as you continue to read the manual, you are thrown heavily into programming. Straight away.
Might I suggest that very early on you have mention of gst-launch – since, using that you can do things without having to write a single line of coding. However, the chapter on gst-launch itself is not an easy-to-read chapter: It starts with a 'simple commandline' and then shows a more complex one – but without any explanation. If we take the first example
gst-launch filesrc location=hello.mp3 ! mad ! audioresample ! osssink
you could then describe what is happening. e.g.:
gst-launch is a program which enables the user to construct pipelines using command-line parameters.
Filesrc is an element (or a plugin) – in this case it will read data from a file and needs to know the name of the file to open. It will output the data so as to be the source for the next element in the pipe-line.
The “!” symbol separates the first element from the next.
mad is the next element in the pipe-line: It will decode mp3 data. It picks up the source provided by the previous element and then outputs the decoded data for the next element in the pipe-line.
Once again, a “!” symbol separates the elements.
audioresample resamples the Audio. (I don’t know why this is a benefit – it could be explained)
Another “!”
osssink takes the audio signal and sends it to an output device which supports (or is supported by) OSS.
The second example could then be similarly explained – which would be a useful exercise since the single vob file is being demuxed with part of the data going one way and the rest another. A reference, at this point to the Overview of available plug-ins would be beneficial. Perhaps an example where more options need to be specified could also be explained.
Then you can say that, if you need to build this into an application, you can do the same stuff with code and if you need to do something which is not currently supported, you can write your own plug-in – so read on...
I hope this is useful.

## Broken link to GstBuffer - need redirects from old API pages to new docs
https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/7 (Bugzilla Migration User, 2023-07-04)

## Submitted by Marie Maurer
**[Link to original bug (#784386)](https://bugzilla.gnome.org/show_bug.cgi?id=784386)**
## Description
Go to:
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-libs/html/gst-plugins-base-libs-gstvideoutils.html#GstVideoCodecFrame
Now click on GstBuffer (e.g. from “GstBuffer *output_buffer;”)
This links to
https://gstreamer.freedesktop.org/usr/share/gtk-doc/html/gstreamer-1.0GstBuffer.html#GstBuffer-struct
which is wrong.
Here is the error: "gstreamer-1.0GstBuffer.html" (the path separator between "gstreamer-1.0" and "GstBuffer.html" is missing).
Hand-generated? Error in tool (GTK-Doc V1.25) to generate documentation?
Or is the struct too difficult for the tool to follow?

## Support for Ubuntu 22.04 LTS
https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/101 (Tript, 2023-07-04)

Ubuntu 22.04 LTS GStreamer installation error:
```
Package gstreamer1.0-doc is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
E: Package 'gstreamer1.0-doc' has no installation candidate
```
I'm using kernel 5.15 generic.
I used the following command to install it.
```
sudo apt-get install libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libgstreamer-plugins-bad1.0-dev gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav gstreamer1.0-doc gstreamer1.0-tools gstreamer1.0-x gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio
```

## webrtcbin: Probably needs a way to close the connection to the peer
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1181 (Sebastian Dröge, 2023-07-04)

Basically mapping to the [`close()`](https://www.w3.org/TR/webrtc/#dom-rtcpeerconnection-close) function of the peer connection. Among other things this would cause orderly shutdown/closing of any SCTP datachannel connections and of the DTLS connection (that can already be triggered with an EOS event).
CC @ystreet

## mpegtsmux: Stream packets can be sent prior to first PAT/PMT
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1556 (Andrew Gall, 2023-07-04)

Since 1.18, by default the first stream created (based on the pad creation order) is considered the PCR stream. Yet this PCR stream may not be the first stream whose buffers are muxed, which results in writing stream data prior to writing the PAT/PMT.
In 1.16, despite the logic introduced by https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/commit/3f0463c43e2c61ba5509c9466dc2a023dc866ad4, it isn't until a few lines below that the stream is initially [created](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/1.16/gst/mpegtsmux/mpegtsmux.c#L979). So despite logging that a PCR stream had been selected and calling `tsmux_program_set_pcr_stream` it did not have any effect. Afterwards the first stream used for output is selected as the PCR stream [here](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/1.16/gst/mpegtsmux/mpegtsmux.c#L1435).
So it appears quite intentional to choose the first program stream as the PCR stream. However, at least for my use case, it is common not to know if the first buffer will be audio or video until after linking the pipeline. In this case the behavior of 1.16 where it selects the first stream buffer for output as the PCR stream is needed.
I do hesitate to create an MR to remove that logic as I only have a limited familiarity with the code. Perhaps it would be better instead to ensure that the SI structures are written before the first stream data?
Reproduction:
`gst-launch-1.0 audiotestsrc num-buffers=1 timestamp-offset=1 ! avenc_aac ! aacparse ! .sink_101 mpegtsmux name=mux ! tsdemux ! fakesink videotestsrc num-buffers=1 ! x264enc ! mux.sink_102`
In this case audio is selected as the PCR stream as it is linked first. Yet due to the audio branch's timestamp-offset=1 the first video buffer is sent before the PAT/PMT. If you enable logging in the demuxer there will be a bunch of "PID 0x0066 Saw packet on a pid we don't handle" messages.

## encodebin: problem with asf/asfmux caps
https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/47 (Bugzilla Migration User, 2023-07-04)

## Submitted by an unknown user
**[Link to original bug (#647821)](https://bugzilla.gnome.org/show_bug.cgi?id=647821)**
## Description
Noticed today that in order to enable transcoding to Windows Media/ASF with Transmageddon I need to create a container profile for encodebin that is set not just to "video/x-ms-asf", but to "video/x-ms-asf, parsed=true" (this is with parsers enabled in encodebin.c). I assume it should be able to figure this out without me adding parsed=true to the caps?

## Incorrect media duration when attempting to stream a MPEG-TS media from URI
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1778 (SUHAIL KUTTY, 2023-07-04)

I am attempting to stream media from a URI, specifically an MPEG-TS file (with .ts extension), in a GUI application. The video seems to play back faster than expected (almost like a 2x fast-forward).
I ran the gst-discoverer-1.0 tool to retrieve information about the media file as read by GStreamer v1.18.5.
I have pasted the output below. I noticed that in the Properties the duration is around 5 min 46 s, whereas the actual media duration is about 15 min, as shown by the output from the ffmpeg tool (and obviously by the file when played back on a commercial media player such as VLC).
The output also indicates that there is a Gap detected by the mpegtspacketizer from the gst-bad-plugin.
How do I solve this problem?
See the entire output pasted from gst-discoverer-1.0 tool here:
```
WARN mpegtspacketizer mpegtspacketizer.c:1888:_set_current_group: GAP detected. diff 0:00:01.039988888
Done discovering http://localhost:8889/Test_4.ts
Properties:
Duration: 0:05:46.709459444
Seekable: yes
Live: no
Tags:
audio codec: MPEG-4 AAC
video codec: H.264
container: video/mpegts, systemstream=(boolean)true, packetsize=(int)188
audio: audio/mpeg, framed=(boolean)true, mpegversion=(int)4, level=(string)2, base-profile=(string)lc, profile=(string)lc, rate=(int)44100, channels=(int)1, stream-format=(string)adts
Tags:
audio codec: MPEG-4 AAC
Codec:
audio/mpeg, framed=(boolean)true, mpegversion=(int)4, level=(string)2, base-profile=(string)lc, profile=(string)lc, rate=(int)44100, channels=(int)1, stream-format=(string)adts
Stream ID: aebffa33bac2d8fa09151f41d3e003c39b8082d07d54f00d44ccfa8a95e5cc38:1/00000101
Language: <unknown>
Channels: 1 (unknown layout)
Sample rate: 44100
Depth: 32
Bitrate: 0
Max bitrate: 0
video: video/x-h264, stream-format=(string)avc, width=(int)704, height=(int)576, framerate=(fraction)0/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)main, level=(string)3, codec_data=(buffer)014d001effe10009674d001e9da82c049901000468ee3c80
Tags:
video codec: H.264
Codec:
video/x-h264, stream-format=(string)avc, width=(int)704, height=(int)576, framerate=(fraction)0/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, alignment=(string)au, profile=(string)main, level=(string)3, codec_data=(buffer)014d001effe10009674d001e9da82c049901000468ee3c80
Stream ID: aebffa33bac2d8fa09151f41d3e003c39b8082d07d54f00d44ccfa8a95e5cc38:1/00000100
Width: 704
Height: 576
Depth: 24
Frame rate: 0/1
Pixel aspect ratio: 1/1
Interlaced: false
Bitrate: 0
Max bitrate: 0
```
See the entire output pasted from ffmpeg tool here:
```
Input #0, mpegts, from 'http://localhost:8889/Test_4.ts':
Duration: 00:15:00.07, start: 1.000000, bitrate: 1800 kb/s
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: h264 (Main) ([27][0][0][0] / 0x001B), yuv420p(progressive), 704x576, 25 fps, 25 tbr, 90k tbn
Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, mono, fltp, 71 kb/s
```

## Add protoc recipe to cerbero for gst-plugins-rs webrtc livekit signaller
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2757 (Tim-Philipp Müller, 2023-07-04)

The following discussion from !4960 should be addressed:
- [ ] @tpm started a [discussion](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4960#note_1986089): (+2 comments)
> Shouldn't this be added here instead? https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/blob/main/ci/windows-docker/Dockerfile

## vtenc: vtenc_h264 causing too many Redistribute latency...
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/626 (Bugzilla Migration User, 2023-07-03)

## Submitted by Miki Grof-Tisza
**[Link to original bug (#789415)](https://bugzilla.gnome.org/show_bug.cgi?id=789415)**
## Description
I'm having some trouble with the pipeline:
gst-launch-1.0 --gst-debug=*:2,vtenc:4 videotestsrc ! videorate ! video/x-raw, format=UYVY, width=1920, height=1080, framerate=30/1 ! queue ! vtenc_h264 ! fakesink
I’m running GStreamer 1.12.3, built from git source, on a 2017 15” MacBook Pro with macOS 10.12.6.
The relevant output:
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.154950000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 5 fps 30/1 time 0:00:00.166666665
Redistribute latency...
0:00:00.235169000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 6 fps 30/1 time 0:00:00.199999998
Redistribute latency...
0:00:00.241439000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 5 fps 30/1 time 0:00:00.166666665
Redistribute latency...
0:00:00.253913000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 6 fps 30/1 time 0:00:00.199999998
Redistribute latency...
0:00:00.278467000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 7 fps 30/1 time 0:00:00.233333331
Redistribute latency...
0:00:00.288046000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 6 fps 30/1 time 0:00:00.199999998
Redistribute latency...
0:00:00.371569000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 7 fps 30/1 time 0:00:00.233333331
Redistribute latency...
0:00:03.043466000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 6 fps 30/1 time 0:00:00.199999998
Redistribute latency...
0:00:03.065692000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 5 fps 30/1 time 0:00:00.166666665
Redistribute latency...
0:00:03.090887000 72088 0x7ff6d7010370 INFO vtenc vtenc.c:1070:gst_vtenc_update_latency:`<vtenc_h264-0>` latency status 0 frames 4 fps 30/1 time 0:00:00.133333332
Redistribute latency...
The vtenc_h264 element is calling gst_video_encoder_set_latency() seemingly way too often. It results in gst-launch printing "Redistribute latency..." quite often (several times per second sometimes).
What's happening seems to be the element keeping track of the underlying encoder's pending frame count. If the pending frame count ever changes (checked every frame), then it calls gst_video_encoder_set_latency().
Instead of tracking the exact latency every frame and forcing a pipeline latency redistribution whenever it changes at all, shouldn't the element just check whether the latency falls outside the currently configured range (queried via gst_video_encoder_get_latency()) and only call gst_video_encoder_set_latency() when it does?
I will attach a patch that works for me shortly.
Version: 1.12.3

## osxaudiosrc drops frames when osxaudiosink misses frames
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/209 (Bugzilla Migration User, 2023-07-03)

## Submitted by Ilya Konstantinov
**[Link to original bug (#753112)](https://bugzilla.gnome.org/show_bug.cgi?id=753112)**
## Description
To reproduce:
$ gst-launch-1.0 osxaudiosrc ! identity drop-probability=0.1 ! osxaudiosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstAudioSrcClock
WARNING: from element /GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0: Can't record audio fast enough
Additional debug info:
gstaudiobasesrc.c(866): GstFlowReturn gst_audio_base_src_create(GstBaseSrc *, guint64, guint, GstBuffer **) (): /GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0:
Dropped 9261 samples. This is most likely because downstream can't keep up and is consuming samples too slowly.
WARNING: from element /GstPipeline:pipeline0/GstOsxAudioSrc:osxaudiosrc0: Can't record audio fast enough
I don't quite understand why osxaudiosrc drops frames when the *sink* is the one that's not receiving all of the frames. This doesn't happen with fakesink or filesink.
Also, it's "solved" by setting "osxaudiosink sync=0".

## RFC: GstRtpBaseDepayload: RTP header extensions get lost when depayloader aggregates RTP packets
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2737 (Jochen Henneberg, 2023-07-03)

In gstrtpbasedepay.c, gst_rtp_base_depayload_set_headers() calls read_rtp_header_extensions() with priv->input_buffer. In the case of rtph264depay and other depayloaders where several RTP buffers are aggregated before a buffer is pushed, this means that only the header extensions of the last RTP buffer used for the output buffer aggregation are parsed. All other header extensions are silently ignored.
I would expect that every RTP buffer is parsed for header extensions and this should not depend on the depayloader implementation.
The header extensions that I currently implement are the full and short IV counters for HDCP over RTP (https://www.digital-cp.com/sites/default/files/HDCP%20Direct%20Adaptation%20Amendment%20Spec%20Rev2_3.pdf)
The full IV counter extension is added to the first RTP packet of a HDU (a video frame in case of video) and the short IV counter is added to all subsequent RTP packets of the same frame. On the depayloader side I only get the short IV counters if the frame is split among multiple RTP packets (which is quite likely).
As a starting point for further discussion, here is a patch that I quickly hacked up to solve the issue (and likely introduce new ones). It remembers all input buffers that have been pushed to the derived class and that contribute to the next buffer pushed on the src pad of the depayloader:
```
modified subprojects/gst-plugins-base/gst-libs/gst/rtp/gstrtpbasedepayload.c
@@ -70,6 +70,7 @@ struct _GstRTPBaseDepayloadPrivate
gboolean source_info;
GstBuffer *input_buffer;
+ GSList *unpushed_buffers;
GstFlowReturn process_flow_ret;
@@ -824,6 +825,9 @@ gst_rtp_base_depayload_handle_buffer (GstRTPBaseDepayload * filter,
priv->process_flow_ret = gst_rtp_base_depayload_push (filter, out_buf);
else
gst_buffer_unref (out_buf);
+ } else {
+ gst_buffer_ref (in);
+ priv->unpushed_buffers = g_slist_prepend (priv->unpushed_buffers, in);
}
gst_buffer_unref (in);
@@ -1289,6 +1293,7 @@ gst_rtp_base_depayload_set_headers (GstRTPBaseDepayload * depayload,
{
GstRTPBaseDepayloadPrivate *priv = depayload->priv;
GstClockTime pts, dts, duration;
+ gboolean ret = FALSE;
pts = GST_BUFFER_PTS (buffer);
dts = GST_BUFFER_DTS (buffer);
@@ -1318,10 +1323,20 @@ gst_rtp_base_depayload_set_headers (GstRTPBaseDepayload * depayload,
if (priv->source_info)
add_rtp_source_meta (buffer, priv->input_buffer);
- return read_rtp_header_extensions (depayload, priv->input_buffer, buffer);
+ priv->unpushed_buffers = g_slist_reverse (priv->unpushed_buffers);
+ for (gint n = 0; n < g_slist_length (priv->unpushed_buffers); ++n) {
+ GstBuffer *buf = g_slist_nth_data (priv->unpushed_buffers, n);
+ ret |= read_rtp_header_extensions (depayload, buf, buffer);
+ }
}
- return FALSE;
+ return ret;
+}
+
+static void
+gst_rtp_base_depayload_unpushed_buffer_unref (void * buf)
+{
+ gst_buffer_unref ((GST_BUFFER (buf)));
}
static GstFlowReturn
@@ -1335,6 +1350,9 @@ gst_rtp_base_depayload_finish_push (GstRTPBaseDepayload * filter,
GST_DEBUG_OBJECT (filter, "Pushed newsegment event on this first buffer");
}
+ g_slist_free_full (g_steal_pointer (&filter->priv->unpushed_buffers),
+ gst_rtp_base_depayload_unpushed_buffer_unref);
+
if (is_list) {
GstBufferList *blist = obj;
return gst_pad_push_list (filter->srcpad, blist);
```

## multifile: add support for reading filenames from a list file
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/24 (Bugzilla Migration User, 2023-07-02)

## Submitted by Jonathan Matthew
**[Link to original bug (#615166)](https://bugzilla.gnome.org/show_bug.cgi?id=615166)**
## Description
Created attachment 158188
listfilesrc element
After watching someone on IRC struggling with multifilesrc for the nth time, I figured it'd be easier to use in some cases if you could give it a file containing a list of source files, rather than having to number the files sequentially.
What I'm attaching is a quick hack based on multifilesrc. It might be worth considering adding the functionality to multifilesrc rather than creating a new element, or sharing the common code (which is most of it) some other way.
**Attachment 158188**, "listfilesrc element":
[gstlistfilesrc.c](/uploads/c4b74b9c9103177a0997dec56f5841e6/gstlistfilesrc.c)

## rtspsrc location length problem
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/886 (末末, 2023-07-02)

`gst-launch-1.0 rtspsrc location="rtsp://183.59.160.61/PLTV/88888895/224/3221226767/00000100000000060000000000268440_0.smil?rrsip=125.88.70.140,rrsip=125.88.104.45&zoneoffset=480&icpid=szmg&accounttype=1&limitflux=-1&limitdur=-1&accountinfo=%7E%7EV2.0%7Ekws5JFgGn-zqVXihML1ujg%7E9rXSI7IeLpF-Iy8tIn5TfxK58x4Z8vevaDBINizpo5FcfEW667kl4l1Ui7NwFD0d09dwbRwtTeqi-3mg1V5El-iYjXauSVB_dL9sI8VG9wg~ExtInfoWNHSPSTb+3AG0FnUkYLPMw==%3A20190726130336%2C21858133%2C183.15.205.163%2C20190726130336%2C11000100000000050000000000000148%2C2185813320190726130335%2C-1%2C0%2C1%2C%2C%2C2%2C%2C%2C%2C2%2CEND&GuardEncType=2&tenantId=8601" ! queue ! rtph264depay ! h264parse ! mpegtsmux ! hlssink target-duration=2 playlist-length=20 max-files=20 playlist-location="/www/1/1.m3u8" location="/www/1/1_%05d.ts"`
The above command reports an error because the location is too long:
![image](/uploads/d9c5178377322ccd79ecde1b015bd432/image.png)
When I reduce the length of the location it works normally; because a parameter is then missing, the server reports 403:
`gst-launch-1.0 rtspsrc location="rtsp://183.59.160.61/PLTV/88888895/224/3221226767/00000100000000060000000000268440_0.smil?rrsip=125.88.70.140,rrsip=125.88.104.45&zoneoffset=480&icpid=szmg&accounttype=1&limitflux=-1&limitdur=-1&accountinfo=%7E%7EV2.0%7Ekws5JFgGn-zqVXihML1ujg%7E9rXSI7IeLpF-Iy8tIn5TfxK58x4Z8vevaDBINizpo5FcfEW667kl4l1Ui7NwFD0d09dwbRwtTeqi-3mg1V5El-iYjXauSVB_dL9sI8VG9wg~ExtInfoWNHSPSTb+3AG0FnUkYLPMw==%3A20190726130336%2C21858133%2C183.15.205.163%2C20190726130336%2C11000100000000050000000000000148%2C2185813320190726130335%2C-1%2C0%2C1%2C%2C%2C2%2C%2C%2C%2C2%2CEND&GuardEncType=" ! queue ! rtph264depay ! h264parse ! mpegtsmux ! hlssink target-duration=2 playlist-length=20 max-files=20 playlist-location="/www/1/1.m3u8" location="/www/1/1_%05d.ts"`
![image](/uploads/bedefc78111653d45f52fe7fda402fc4/image.png)
Version: 1.16.2

## Segfault in libgdk-3 when removing twice a GtkGstGLWidget
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/633 (Link Mauve, 2023-07-02)

In [Gajim](https://gajim.org), an XMPP client, we want to be able to change the video sink for video calls, but doing so multiple times leads to a crash.
Here is a testcase reproducing this issue: [gst-gtk-crash.rs](/uploads/752a6b06438909c5bfaadfb7d326de38/gst-gtk-crash.rs)
It’s based on the [gtksink.rs](https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/blob/master/examples/src/bin/gtksink.rs) example with all comments removed and replaced with ones relevant to this issue.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/576
v4l2: Incompatible pointer type compiler error on musl (2023-07-02T16:40:30Z, W. Michael Petullo)

The musl C library defines ioctl thusly:
int ioctl (int, int, ...);
However, gstv4l2object.h defines GstV4l2Object's ioctl field as:
gint (*ioctl) (gint fd, ioctl_req_t request, ...);
and, except when `__ANDROID__` is defined, gstv4l2object.h defines ioctl_req_t as a gulong.
This causes a warning when building gst-plugins-good against musl:
CC libgstvideo4linux2_la-gstv4l2object.lo
gstv4l2object.c: In function 'gst_v4l2_object_new':
gstv4l2object.c:532:23: error: assignment from incompatible pointer type [-Werror=incompatible-pointer-types]
v4l2object->ioctl = ioctl;
^
cc1: all warnings being treated as errors
It looks like gstv4l2object.h

https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/468
Not all outgoing buffers are caught while using data probes (2023-07-02T16:38:28Z, Bugzilla Migration User)

## Submitted by Vavooon
**[Link to original bug (#795437)](https://bugzilla.gnome.org/show_bug.cgi?id=795437)**
## Description
When I attach a data probe to the srcpad of the `rtph264pay` element, it doesn't catch all the buffers the element sends.
I'm adding a probe with
```
gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, (GstPadProbeCallback) cb_have_data, &m_latencyInfo, NULL);
```
and then writing down buffer info:
```
static GstPadProbeReturn
cb_have_data (GstPad          *pad,
              GstPadProbeInfo *info,
              gpointer         user_data)
{
  /* the element and buffer were implicit in the original snippet */
  GstBuffer *buffer = GST_PAD_PROBE_INFO_BUFFER (info);
  GstElement *element = gst_pad_get_parent_element (pad);
  GstStructure *stats;
  guint mtu, seqnum, timestamp;
  gint seqnumOffset;

  g_object_get (G_OBJECT (element), "stats", &stats, NULL);
  g_object_get (G_OBJECT (element), "mtu", &mtu, NULL);
  g_object_get (G_OBJECT (element), "seqnum", &seqnum, NULL);
  g_object_get (G_OBJECT (element), "seqnum-offset", &seqnumOffset, NULL);
  gst_structure_get_uint (stats, "timestamp", &timestamp);

  log_debug ("seqnum: %" PRIu32 " seqnum-offset: %" PRId32 " ts: %" PRIu32
      " mtu: %" PRIu32 " size: %" G_GSIZE_FORMAT,
      seqnum, seqnumOffset, timestamp, mtu, gst_buffer_get_size (buffer));

  gst_structure_free (stats);
  gst_object_unref (element);
  return GST_PAD_PROBE_OK;
}
```
With the default MTU value of 1400 it shows only a small number of the sent buffers (or packets):
```
seqnum: 0 seqnum-offset: 0 ts: 2467070291 mtu: 1400 size: 14
seqnum: 1 seqnum-offset: 0 ts: 2467070291 mtu: 1400 size: 23
seqnum: 2 seqnum-offset: 0 ts: 2467070291 mtu: 1400 size: 16
seqnum: 163 seqnum-offset: 0 ts: 2467070291 mtu: 1400 size: 14
seqnum: 164 seqnum-offset: 0 ts: 2467070291 mtu: 1400 size: 23
seqnum: 165 seqnum-offset: 0 ts: 2467070291 mtu: 1400 size: 16
seqnum: 326 seqnum-offset: 0 ts: 2467160291 mtu: 1400 size: 14
seqnum: 327 seqnum-offset: 0 ts: 2467160291 mtu: 1400 size: 23
seqnum: 328 seqnum-offset: 0 ts: 2467160291 mtu: 1400 size: 16
seqnum: 489 seqnum-offset: 0 ts: 2467250291 mtu: 1400 size: 14
seqnum: 490 seqnum-offset: 0 ts: 2467250291 mtu: 1400 size: 23
seqnum: 491 seqnum-offset: 0 ts: 2467250291 mtu: 1400 size: 16
seqnum: 652 seqnum-offset: 0 ts: 2467340291 mtu: 1400 size: 14
seqnum: 653 seqnum-offset: 0 ts: 2467340291 mtu: 1400 size: 23
seqnum: 654 seqnum-offset: 0 ts: 2467340291 mtu: 1400 size: 16
seqnum: 815 seqnum-offset: 0 ts: 2467430291 mtu: 1400 size: 14
```
and doesn't show the main packets carrying encoded video, only (probably) the SPS/PPS ones.
However, all packets are displayed when I set the MTU to a value larger than the maximum packet size the payloader produces:
```
seqnum: 0 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 14
seqnum: 1 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 23
seqnum: 2 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 16
seqnum: 3 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 220525
seqnum: 4 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 14
seqnum: 5 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 23
seqnum: 6 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 16
seqnum: 7 seqnum-offset: 0 ts: 154585161 mtu: 957712 size: 220784
seqnum: 8 seqnum-offset: 0 ts: 154675161 mtu: 957712 size: 14
seqnum: 9 seqnum-offset: 0 ts: 154675161 mtu: 957712 size: 23
seqnum: 10 seqnum-offset: 0 ts: 154675161 mtu: 957712 size: 16
seqnum: 11 seqnum-offset: 0 ts: 154675161 mtu: 957712 size: 220781
seqnum: 12 seqnum-offset: 0 ts: 154765161 mtu: 957712 size: 14
seqnum: 13 seqnum-offset: 0 ts: 154765161 mtu: 957712 size: 23
seqnum: 14 seqnum-offset: 0 ts: 154765161 mtu: 957712 size: 16
seqnum: 15 seqnum-offset: 0 ts: 154765161 mtu: 957712 size: 220575
seqnum: 16 seqnum-offset: 0 ts: 154855161 mtu: 957712 size: 14
```
Version: 1.14.0

https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/442
splitmuxsink: add a property "location-format-type" to allow filenames with values other than fragment index (2023-07-02T16:37:01Z, Bugzilla Migration User)

## Submitted by Abhinav
**[Link to original bug (#793681)](https://bugzilla.gnome.org/show_bug.cgi?id=793681)**
## Description
At present, the location property only accepts the "%d" format specifier, which represents the fragment index. In many cases it would be helpful to allow other values, such as the first buffer's PTS, in filenames.
An additional property for splitmuxsink, called "location-format-type", could be added to specify the value type represented by the format specifier in the location string pattern. For example, if users want the first buffer's time in the filenames, they should be able to request it with property values such as:
max-size-time=120000000000
location=video_%Y-%m-%d_%H:%M:%S
location-format-type=2
where location-format-type selects among different format types, such as
(1): fragment-index - Fragment Index (default)
(2): first-buffer-pts - First Buffer PTS
.. and so on
Note:
In plugin version 1.12, a new signal, format-location-full, was added, which is flexible enough to support the above requirement, but it requires users to write signal-handling code. Since having the first buffer's time in filenames is a very common requirement, it should be possible to request it through simple configuration rather than complex signal handling. This should justify the above enhancement proposal.
Version: 1.10.x

https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/419
splitmuxsink hangs sometime (2023-07-02T16:34:43Z, Bugzilla Migration User)

## Submitted by Alex
**[Link to original bug (#790976)](https://bugzilla.gnome.org/show_bug.cgi?id=790976)**
## Description
I try to use the following pipeline:
gst-launch-1.0 rtspsrc location=rtsp://admin:admin@192.168.1.108 ! rtph264depay ! h264parse ! tee name=tee0 ! queue ! splitmuxsink location=chain.%d.mp4 tee0. ! queue ! decodebin ! autovideosink
In most cases it hangs (but it sometimes works).
After debugging, I found that it hangs on
"GST_SPLITMUX_WAIT (splitmux);"
in the function "complete_or_wait_on_out" of splitmuxsink.
Version: 1.12.x