GStreamer issues (https://gitlab.freedesktop.org/groups/gstreamer/-/issues)

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2910
gtkwaylandsink: cancel pending redraw callback on pause/resume (2023-09-19T08:41:42Z, Hugues Fruchet)

### Describe your issue
Gtk-based application freezes have been observed when doing intensive pause/resume during video playback or camera preview.
#### Setup
* STM32MP25-EV evaluation board
* GStreamer-1.22.3
### How reproducible is the bug?
Launch the Gtk player demo app provided with the ST image (a moving ball on a blue background is shown):
<pre>
root@stm32mp25:~# /usr/local/demo/bin/touch-event-gtk-player
</pre>
Then single-tap on the touchscreen to pause/resume several times until everything freezes.
At this stage it is not possible to exit the app using a double-tap on the touchscreen.
The only way to exit is Ctrl+C.
<pre>
root@stm32mp25:~# GST_DEBUG=gtkway*:3,*GST_STATES*:9 /usr/local/demo/bin/touch-event-gtk-player
</pre>
```
0:00:13.358123558 2131 0x2722c90 ERROR gtkwaylandsink gstgtkwaylandsink.c:713:gst_gtk_wayland_sink_change_state:<gtkwsink> > gst_gtk_wayland_sink_change_state
0:00:13.691456850 2131 0x2722c90 ERROR gtkwaylandsink gstgtkwaylandsink.c:744:gst_gtk_wayland_sink_change_state:<gtkwsink> 1 gst_gtk_wayland_sink_change_state
0:00:13.691524750 2131 0x2722c90 LOG GST_STATES gstelement.c:3292:gst_element_change_state_func:<gtkwsink> default handler tries setting state from PLAYING to PAUSED (0023)
0:00:13.693506664 2131 0x2722c90 DEBUG GST_STATES gstbin.c:3351:bin_handle_async_start:<pipeline0> state change busy
0:00:13.693591765 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3110:gst_element_change_state:<gtkwsink> element will change state ASYNC
0:00:13.693636515 2131 0x2722c90 LOG GST_STATES gstelement.c:3150:gst_element_change_state:<gtkwsink> exit async state change 2
0:00:13.693734391 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3051:gst_element_set_state_func:<gtkwsink> returned ASYNC
==============================> gtkwaylandsink return ASYNC here and never complete to PAUSED <===========================
0:00:13.693780116 2131 0x2722c90 INFO GST_STATES gstbin.c:2942:gst_bin_change_state_func:<pipeline0> child 'gtkwsink' is changing state asynchronously to PAUSED
0:00:13.693840567 2131 0x2722c90 INFO GST_STATES gstbin.c:2479:gst_bin_element_set_state:<videotestsrc0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:13.693886117 2131 0x2722c90 DEBUG GST_STATES gstelement.c:2967:gst_element_set_state_func:<videotestsrc0> set_state to PAUSED
0:00:13.693924167 2131 0x2722c90 DEBUG GST_STATES gstelement.c:2992:gst_element_set_state_func:<videotestsrc0> setting target state to PAUSED
0:00:13.693970417 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3001:gst_element_set_state_func:<videotestsrc0> current PLAYING, old_pending VOID_PENDING, next VOID_PENDING, old return SUCCESS
0:00:13.694013443 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3037:gst_element_set_state_func:<videotestsrc0> final: setting state from PLAYING to PAUSED
0:00:13.694062118 2131 0x2722c90 LOG GST_STATES gstelement.c:3292:gst_element_change_state_func:<videotestsrc0> default handler tries setting state from PLAYING to PAUSED (0023)
0:00:13.694100393 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3128:gst_element_change_state:<videotestsrc0> element changed state SUCCESS
0:00:13.694138119 2131 0x2722c90 INFO GST_STATES gstelement.c:2816:gst_element_continue_state:<videotestsrc0> completed state change to PAUSED
0:00:13.694180294 2131 0x2722c90 INFO GST_STATES gstelement.c:2716:_priv_gst_element_state_changed:<videotestsrc0> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:13.694248519 2131 0x2722c90 LOG GST_STATES gstelement.c:3145:gst_element_change_state:<videotestsrc0> exit state change 1
0:00:13.694288445 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3051:gst_element_set_state_func:<videotestsrc0> returned SUCCESS
0:00:13.694331270 2131 0x2722c90 INFO GST_STATES gstbin.c:2935:gst_bin_change_state_func:<pipeline0> child 'videotestsrc0' changed state to 3(PAUSED) successfully
0:00:14.027793538 2131 0x2722c90 DEBUG GST_STATES gstbin.c:2997:gst_bin_change_state_func:<pipeline0> iterator done
0:00:14.027864139 2131 0x2722c90 LOG GST_STATES gstelement.c:3292:gst_element_change_state_func:<pipeline0> default handler tries setting state from PLAYING to PAUSED (0023)
0:00:14.027905664 2131 0x2722c90 DEBUG GST_STATES gstbin.c:3013:gst_bin_change_state_func:<pipeline0> we have ASYNC elements SUCCESS -> ASYNC
0:00:13.698795726 2131 0x2365300 INFO GST_STATES gstelement.c:2688:gst_element_abort_state:<gtkwsink> aborting state from PLAYING to PAUSED
0:00:14.027965964 2131 0x2722c90 DEBUG GST_STATES gstbin.c:3061:gst_bin_change_state_func:<pipeline0> done changing bin's state from PLAYING to PAUSED, now in PLAYING, ret ASYNC
0:00:14.028064440 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3110:gst_element_change_state:<pipeline0> element will change state ASYNC
0:00:14.028104065 2131 0x2722c90 LOG GST_STATES gstelement.c:3150:gst_element_change_state:<pipeline0> exit async state change 2
0:00:14.028140941 2131 0x2722c90 DEBUG GST_STATES gstelement.c:3051:gst_element_set_state_func:<pipeline0> returned ASYNC
--> SIMPLE TAP
--> BEGIN diff = 1199
Got qos message from /GstPipeline:pipeline0/GstGtkWaylandSink:gtkwsink
[...]
Got state-changed message from /GstPipeline:pipeline0/GstGtkWaylandSink:gtkwsink
Got state-changed message from /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0
Got state-changed message from /GstPipeline:pipeline0
new state: GST_STATE_PLAYING
Got state-changed message from /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0
0:00:14.733821170 2131 0x2722c90 INFO GST_STATES gstbin.c:2069:gst_bin_get_state_func:<pipeline0> getting state
0:00:14.733914821 2131 0x2722c90 DEBUG GST_STATES gstelement.c:2504:gst_element_get_state_func:<pipeline0> getting state, timeout 99:99:99.999999999
0:00:14.733948471 2131 0x2722c90 DEBUG GST_STATES gstelement.c:2509:gst_element_get_state_func:<pipeline0> RETURN is ASYNC
0:00:14.733976421 2131 0x2722c90 INFO GST_STATES gstelement.c:2532:gst_element_get_state_func:<pipeline0> waiting for element to commit state
==============================> FROZEN HERE <===========================
```
### Solutions you have tried
Traces and analysis show that the gtkwaylandsink element is stalled on the state change
from PLAYING to PAUSED or from PAUSED to PLAYING.
A first problem occurs when the parent class returns an ASYNC state change:
in this case we prematurely return from change_state. Fix that by
checking for GST_STATE_CHANGE_FAILURE instead:
<pre>
diff --git a/ext/gtk/gstgtkwaylandsink.c b/ext/gtk/gstgtkwaylandsink.c
@@ -736,7 +736,7 @@ gst_gtk_wayland_sink_change_state (GstElement * element,
}
ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition);
- if (ret != GST_STATE_CHANGE_SUCCESS)
+ if (ret == GST_STATE_CHANGE_FAILURE)
return ret;
</pre>
=> This code has been taken from waylandsink, which shares more or less the same code here.
The other problem is a possibly pending Wayland listener callback
(see the redraw_callback mechanism) while changing state, which causes
a further freeze of the element. Fix this by destroying the listener callback
and resetting the redraw_pending flag when going from PLAYING to PAUSED and
from PAUSED to PLAYING.
<pre>
diff --git a/ext/gtk/gstgtkwaylandsink.c b/ext/gtk/gstgtkwaylandsink.c
switch (transition) {
@@ -762,6 +762,18 @@ gst_gtk_wayland_sink_change_state (GstElement * element,
priv->redraw_pending = FALSE;
g_mutex_unlock (&priv->render_lock);
break;
+ case GST_STATE_CHANGE_PAUSED_TO_PLAYING:
+ case GST_STATE_CHANGE_PLAYING_TO_PAUSED:
+ /* Destroy pending redraw callback otherwise
+ * element may freeze */
+ g_mutex_lock (&priv->render_lock);
+ if (priv->callback) {
+ wl_callback_destroy (priv->callback);
+ priv->callback = NULL;
+ }
+ priv->redraw_pending = FALSE;
+ g_mutex_unlock (&priv->render_lock);
+ break;
default:
break;
}
</pre>

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2886
videorate: hits assert with drop-only and variable output framerate (2023-08-08T09:05:18Z, Philippe Normand)

```
gst-launch-1.0 videotestsrc do-timestamp=1 num-buffers=5 ! videorate drop-only=true ! video/x-raw,framerate=0/1 ! queue ! fakesink
ERROR:../subprojects/gst-plugins-base/gst/videorate/gstvideorate.c:745:gst_video_rate_push_buffer: assertion failed: (GST_BUFFER_DURATION_IS_VALID (outbuf))
Bail out! ERROR:../subprojects/gst-plugins-base/gst/videorate/gstvideorate.c:745:gst_video_rate_push_buffer: assertion failed: (GST_BUFFER_DURATION_IS_VALID (outbuf))
```

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2880
x264: Missing ROI driven QP adjustment (2023-08-03T19:31:21Z, Dhaval Sutaria)

### Describe your issue
For ROI-based QP values, we are using the `gst_buffer_add_video_region_of_interest_meta` and `gst_video_region_of_interest_meta_add_param` APIs.
But after calling these APIs upstream of x264enc, we are not getting the expected behaviour.
#### Expected Behavior
The selected region should look different from the other regions. For example, if the QP is high, that ROI should look blurrier.
#### Observed Behavior
No difference between the ROI and non-ROI areas.
#### Setup
Linux 18.04

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2861
playsink: Integrate bayer2rgb and dsdconvert (2023-07-27T12:22:20Z, Carlos Rafael Giani)

Following the discussions from [the DSD merge request](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3901) and [this issue about DSD support](https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/972), fundamental DSD support is in place. However, DSD bits can be grouped into different formats. For example, DSDU32LE groups bits into 32-bit words. (These are not sample formats! See [GstDsdInfo](https://gstreamer.freedesktop.org/documentation//audio/gstdsd.html?gi-language=c#GstDsdInfo) for details about this.) If for example an ALSA device can handle DSDU32LE, but the incoming data has its bits grouped as DSDU16BE, then a conversion is needed, otherwise there is a not-negotiated error.
The `dsdconvert` element uses [gst_dsd_convert](https://gstreamer.freedesktop.org/documentation//audio/gstdsd.html?gi-language=c#gst_dsd_convert) to take care of this conversion. However, it needs to be integrated into playsink. Once that is done, DSD is fully covered by playbin and playsink - autoplugging would insert DSF / DFF parsers, and would detect if the downstream audio sink can handle DSD or not, inserting DSD->PCM elements if necessary. And, in case of hardware that can directly handle DSD, it would convert between grouping formats (DSDU8, DSDU32LE etc.) as needed.
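To make the grouping problem concrete: since the underlying 1-bit stream is unchanged, regrouping is essentially byte reshuffling. The standalone sketch below is not the real `gst_dsd_convert` (which also handles channel interleaving); it assumes a mono stream and that the oldest DSD byte sits in the most significant byte of each grouped word, so going from 16-bit big-endian to 32-bit little-endian words reverses each 4-byte run.

```c
/* Standalone illustration (NOT the real gst_dsd_convert): regroup a mono
 * DSD byte stream from 16-bit big-endian words (oldest byte stored first
 * in memory) to 32-bit little-endian words (oldest byte in the most
 * significant, i.e. last-stored, position). Assumes num_bytes is a
 * multiple of 4 and that word semantics are as described above. */
#include <stdint.h>
#include <stddef.h>

static void
dsd_u16be_to_u32le (const uint8_t * in, uint8_t * out, size_t num_bytes)
{
  for (size_t i = 0; i + 4 <= num_bytes; i += 4) {
    /* Reverse each 4-byte run so the oldest byte lands in the MSB
     * of the little-endian 32-bit word. */
    out[i + 0] = in[i + 3];
    out[i + 1] = in[i + 2];
    out[i + 2] = in[i + 1];
    out[i + 3] = in[i + 0];
  }
}
```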
Also, similarly, Bayer -> RGB conversion is needed to be able to fully support Bayer graphics out of the box. The `bayer2rgb` element needs to be integrated into `playsink` for this purpose.

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2814
AESdec wav file (2023-07-13T16:57:46Z, Dan Sirbu)

I'm trying to use AES enc/dec with wav. I have run the following pipeline to encrypt a wav file:
gst-launch-1.0 filesrc location=./convo2_long.raw ! rawaudioparse format=mulaw sample-rate=8000 num-channels=1 use-sink-caps=false ! mulawdec ! audioconvert ! capsfilter caps=audio/x-raw,format=S16LE ! wavenc ! aesenc key=1f9423681beb9a79215820f6bda73d0f iv=e9aa8e834d8d70b7e0d254ff670dd718 per-buffer-padding=false ! filesink location=./convo2_long_enc.wav
And based on the 'log_aesenc' logs, I think it is doing what it is supposed to.
Then, I tried to decode the wav file using:
gst-launch-1.0 filesrc location=./convo2_long_enc.wav ! aesdec key=1f9423681beb9a79215820f6bda73d0f iv=e9aa8e834d8d70b7e0d254ff670dd718 per-buffer-padding=false ! filesink location=./convo2_long_dec.wav
but it fails with:
ERROR: from element /GstPipeline:pipeline0/GstAesDec:aesdec0: Cipher finalization failed. Additional debug info: ../ext/aes/gstaesdec.c(416): gst_aes_dec_sink_event (): /GstPipeline:pipeline0/GstAesDec:aesdec0: Error while finalizing the stream
I do not get any extra details in the log file.
GST_DEBUG="2,wavenc:7,aesenc:7,aesdec:7" GST_DEBUG_FILE="./log.txt"
[log_aesdec.txt](/gstreamer/gst-build/uploads/972a382ed909fc7e5f1c96a40d66a23f/log_aesdec.txt)
Am I doing something wrong, or is it a bug?

[log_aesenc.txt](/uploads/b875a3aec1ae0139e6c788db1705f2d2/log_aesenc.txt)
[log_aesdec.txt](/uploads/26dab15880f4ef00d6756b99cf8138a6/log_aesdec.txt)

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2809
Publish gst-python on PyPI (2023-07-18T13:27:06Z, James Henstridge)

The common way to pull in dependencies for a Python project is to install them via `pip`: either directly, or via an `install_requires` stanza in `setup.py`. This requires the dependency to be available on [PyPI](https://pypi.org/). As gst-python is not available on PyPI, it can't be pulled in in the normal way.
I realise that the package is somewhat special, requiring various GStreamer headers and introspection data to be installed. But if [PyGObject](https://pypi.org/project/PyGObject/) can do it, presumably gst-python can too.

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2802
mpegtsmux: mux private data streams (2023-10-01T20:31:56Z, Bugzilla Migration User)

## Submitted by Martijn Grendelman
**[Link to original bug (#673582)](https://bugzilla.gnome.org/show_bug.cgi?id=673582)**
## Description
Mpegtsdemux is capable of demuxing private data streams, like subtitles and teletext, but mpegtsmux is not capable of muxing them. So, if you want to transcode a TS, but keep all non-audio/video streams as-is, this is not possible.
This is a request to add private data stream support to mpegtsmux.

---

https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/383
fallbacksrc: support multiple video and audio tracks (2023-08-24T23:17:25Z, Juan David Adarve)

I'm using the [`fallbacksrc` element](https://gstreamer.freedesktop.org/documentation/fallbackswitch/fallbacksrc.html) to consume two SRT sources as:
```bash
export GST_DEBUG_DUMP_DOT_DIR=/tmp
gst-launch-1.0 fallbacksrc uri="srt://SOME_IP:10001?mode=caller" fallback-uri="srt://SOME_IP:10005?mode=caller" timeout=500000000 name="decodebin" \
decodebin.video ! queue ! fakesink \
decodebin.audio ! queue ! fakesink
```
The `fallbacksrc` works as expected in this case.
However, if my SRT sources contain more than one video or audio track, I'm not able to collect the decoded data out of the `fallbacksrc` as there are no `video_%u` nor `audio_%u` pads I could use to request the data. Ideally, I would like:
```bash
export GST_DEBUG_DUMP_DOT_DIR=/tmp
gst-launch-1.0 fallbacksrc uri="srt://SOME_IP:10001?mode=caller" fallback-uri="srt://SOME_IP:10005?mode=caller" timeout=500000000 name="decodebin" \
decodebin.video_0 ! queue ! fakesink \
decodebin.video_1 ! queue ! fakesink \
decodebin.video_2 ! queue ! fakesink \
decodebin.video_3 ! queue ! fakesink \
decodebin.video_4 ! queue ! fakesink \
decodebin.audio_0 ! queue ! fakesink \
decodebin.audio_1 ! queue ! fakesink
```
Here's a diagram with the pipeline when there are 5 video and 2 audio tracks. The underlying `uridecodebin3` is detecting all the tracks, but it is decoding `video_0` and `audio_0` only.
![pipeline.svg](/uploads/ecbcde8377f1fe03873d6e7de404fe84/pipeline.svg)
The enhancement I propose is to expose all available tracks as source pads of the `fallbacksrc` element as `audio_%u`, `video_%u`. Looking at the documentation of the element, the pad templates are `audio` and `video`, without the typical `_%u` suffix of multiple pads. Not sure if they can be modified without breaking the current element API, or if this will require a new `fallbacksrc2` element.
I'm happy to contribute to enhancing the element if needed.

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2748
mfvideosrc does not connect at startup, but ksvideosrc does (2023-07-05T22:08:36Z, Tim Williams)

I have a Python application that connects to various cameras. I'm using pyqtgraph (PyQt5) for the GUI.
I'm also reading from an INI file that reads in the properties for the videosrc element. When I use `ksvideosrc`, I can connect to the element and set it to playing with no problem. With `mfvideosrc`, I have to select it from my dropdown list of available videosrc elements.
Since `ksvideosrc` is supposed to be deprecated, I'd like to get `mfvideosrc` connected when I first start my program.
Thanks for any help.
INI file sections with the 2 videosrc elements:
```
[[[params]]]
name = ksvideosrc
blocksize = 4096
num-buffers = -1
typefind = False
do-timestamp = False
device-path =
device-name =
device-index = -1
do-stats = False
fps = -1
enable-quirks = True
```
```
[[[params]]]
name = mfvideosrc
blocksize = 4096
num-buffers = -1
typefind = False
do-timestamp = False
device-path =
device-name =
device-index = -1
```
I'd like to be able to connect to mfvideosrc after reading my INI file, without having to reselect it, as selecting from the dropdown list just gets the default properties that I then have to change (like `device-index`).
Trying to connect (out of the box):
```
0:00:07.416300000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:07.416782000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:07.420407000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:07.420892000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:07.425210000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:07.431198000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:07.431897000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
2023-07-01 20:23:53,610.610: (INFO) pipeline:open: mfvideosrc: Setting pipeline to playing state
0:00:07.446474000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:270:gst_mf_video_src_start:<mfvideosrc> Start
0:00:07.462567000 184740 000002914FE9BB10 ERROR mfvideosrc gstmfvideosrc.cpp:288:gst_mf_video_src_start:<mfvideosrc> Couldn't create capture object
2023-07-01 20:23:53,630.630: (ERROR) camera:open_camera: Camera 0: failed to connect to mfvideosrc
0:00:07.464518000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:270:gst_mf_video_src_start:<mfvideosrc> Start
0:00:07.466875000 184740 000002914FE9BB10 ERROR mfvideosrc gstmfvideosrc.cpp:288:gst_mf_video_src_start:<mfvideosrc> Couldn't create capture object
```
After selecting `mfvideosrc` from the dropdown menu (first select `ksvideosrc`, then `mfvideosrc`):
```
** (python.exe:184740): WARNING **: 20:23:58.695: "ksvideosrc" is deprecated and will be removedin the future. Use "mfvideosrc" element instead
0:00:15.923171000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:15.923836000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:15.924526000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:15.924969000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:15.926285000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:15.926882000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
0:00:15.927327000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string){ BGRx, BGRA, BGR, RGB15, RGB16, VUYA, YUY2, YVYU, UYVY, NV12, YV12, I420, P010, P016, v210, v216, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; image/jpeg, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
2023-07-01 20:24:02,099.099: (INFO) pipeline:open: mfvideosrc: Setting pipeline to playing state
0:00:15.933589000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:270:gst_mf_video_src_start:<mfvideosrc> Start
0:00:16.291420000 184740 0000029159D022F0 DEBUG mfvideosrc gstmfvideosrc.cpp:358:gst_mf_video_src_get_caps:<mfvideosrc> Returning caps video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1; video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1; video/x-raw, format=(string)NV12, width=(int)640, height=(int)360, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1; video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)10/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:0:1, chroma-site=(string)dv; video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:0:1, chroma-site=(string)dv; video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)2:4:0:1, chroma-site=(string)dv; image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1; image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1; image/jpeg, width=(int)640, height=(int)360, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1
0:00:16.317432000 184740 0000029159D022F0 DEBUG mfvideosrc gstmfvideosrc.cpp:320:gst_mf_video_src_set_caps:<mfvideosrc> Set caps image/jpeg, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:4:0:1
0:00:16.678440000 184740 0000029159D022F0 DEBUG mfvideosrc gstmfvideosrc.cpp:509:gst_mf_video_src_create:<mfvideosrc> Updated latency value 0:00:00.006439700
2023-07-01 20:24:02,969.969: (INFO) camera:open_camera: Camera 0: connected to mfvideosrc
2023-07-01 20:24:03,092.092: (WARNING) pipeline:on_timer: capturesink: <flags GST_MESSAGE_LATENCY of type Gst.MessageType>: (no Gst.Structure)
2023-07-01 20:24:03,094.094: (WARNING) pipeline:on_timer: viewsink: <flags GST_MESSAGE_LATENCY of type Gst.MessageType>: (no Gst.Structure)
2023-07-01 20:24:03,097.097: (WARNING) pipeline:on_timer: capturesink: <flags GST_MESSAGE_LATENCY of type Gst.MessageType>: (no Gst.Structure)
2023-07-01 20:24:03,098.098: (WARNING) pipeline:on_timer: viewsink: <flags GST_MESSAGE_LATENCY of type Gst.MessageType>: (no Gst.Structure)
2023-07-01 20:24:03,100.100: (WARNING) pipeline:on_timer: viewsink: <flags GST_MESSAGE_LATENCY of type Gst.MessageType>: (no Gst.Structure)
2023-07-01 20:24:03,101.101: (WARNING) pipeline:on_timer: capturesink: <flags GST_MESSAGE_LATENCY of type Gst.MessageType>: (no Gst.Structure)
2023-07-01 20:24:06,458.458: (INFO) pipeline:close: mfvideosrc: Setting pipeline to null state
0:00:20.293125000 184740 000002914FE9BB10 DEBUG mfvideosrc gstmfvideosrc.cpp:302:gst_mf_video_src_stop:<mfvideosrc> Stop
2023-07-01 20:24:06,722.722: (INFO) camera:close_camera: Camera 0: disconnected
```

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2742
qtdemux: Support chapters and provide a GstToc (2023-07-01T19:20:38Z, Bugzilla Migration User)

## Submitted by Bastien Nocera `@hadess`
**[Link to original bug (#540887)](https://bugzilla.gnome.org/show_bug.cgi?id=540887)**
## Description
There's currently no chapters support in qtdemux. This could be used to browse in files such as:
http://downloads.oreilly.com/make/MAKE_2005-07-18.m4b
### Depends on
* [Bug 540890](https://bugzilla.gnome.org/show_bug.cgi?id=540890)
### Blocking
* [Bug 163546](https://bugzilla.gnome.org/show_bug.cgi?id=163546)
* [Bug 328298](https://bugzilla.gnome.org/show_bug.cgi?id=328298)

---

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2710
vaapi: wrong colors on mkv h264 videos (2023-06-28T16:22:02Z, Mattia)

GStreamer-based media players like _totem_ (GNOME Videos) and _clapper_
paint wrong colours in video playback;
I double-checked it with _mpv_ and _vlc_;
Arch Linux w/ gstreamer 1.22.4-1 on GNOME 44.2 and Wayland
Video file properties, from `mediainfo`:
```
Video
ID : 1
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High@L4
Format settings : CABAC / 4 Ref Frames
Format settings, CABAC : Yes
Format settings, Reference frames : 4 frames
Codec ID : V_MPEG4/ISO/AVC
Duration : 1 h 0 min
Bit rate mode : Variable
Bit rate : 7 194 kb/s
Maximum bit rate : 15.0 Mb/s
Width : 1 920 pixels
Height : 1 080 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 23.976 (24000/1001) FPS
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 0.145
Stream size : 3.02 GiB (90%)
Language : English
Default : Yes
Forced : No
Color range : Limited
Color primaries : BT.709
Transfer characteristics : BT.709
Matrix coefficients : BT.709
```
Below, screenshots;
totem:
![Screenshot_from_2023-06-25_10-48-40](/uploads/a12b51de65aa5de83e9d8ab2c8f92f69/Screenshot_from_2023-06-25_10-48-40.png)
mpv:
![Screenshot_from_2023-06-25_10-48-44](/uploads/0cf129a272d1de80b6726389a432e6f3/Screenshot_from_2023-06-25_10-48-44.png)
vlc:
![Screenshot_from_2023-06-25_10-54-41](/uploads/20cbb6a5a6d988f13fde0f714dd2a901/Screenshot_from_2023-06-25_10-54-41.png)
clapper:
![Screenshot_from_2023-06-25_11-18-09](/uploads/6fe195bd913785d27e1a01ec5d228ca1/Screenshot_from_2023-06-25_11-18-09.png)
_PS: I am not an expert but I will stay here to help and dig into the issue as I can._

# `Queued GOP time is negative` error in splitmuxsink for rtspsrc
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2703 · 2024-02-07 · Ziad Hatahet

We're hitting this error when running a GStreamer pipeline to capture HLS video segments:
```
ERROR: from element /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink: Timestamping error on input streams
Additional debug info:
../gst/multifile/gstsplitmuxsink.c(2594): handle_gathered_gop (): /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink:
Queued GOP time is negative -0:00:00.035816292
Execution ended after 0:22:00.008870101
```
The pipeline that produced this error is
```
$ gst-launch-1.0 rtspsrc location=rtsp://$USER:$PASSWORD@$IP is-live=true protocols=tcp \
! rtph264depay wait-for-keyframe=true request-keyframe=true \
! h264parse \
! splitmuxsink name=splitmuxsink max-size-time=10000000000 send-keyframe-requests=true muxer=mpegtsmux location=segment%05d.ts
```
There should not be any errors when running the pipeline. I tried setting `config-interval=-1` on the `h264parse` element, but the issue persists.

# gst-python: When using custom BaseTransform, pad_template != NULL error occurred
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2669 · 2024-01-22 · 황현동

I am developing a custom element using gst-python.
I inherited BaseTransform as shown below, created a custom element, added it to the pipeline, and linked it, but an error occurred.

* Environment:
```bash
❯ sw_vers -productName
macOS
❯ sw_vers -productVersion
13.4
❯ brew info gstreamer
==> gstreamer: stable 1.22.3 (bottled), HEAD
❯ brew info gst-python
==> gst-python: stable 1.20.5 (bottled)
```
* Source code:
```python
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstBase', '1.0')
from gi.repository import GObject, Gst, GstBase

Gst.init(None)

class MyTransform(GstBase.BaseTransform):
    __gstmetadata__ = ('CustomTransform', 'Transform',
                       'A custom transform element', 'Author Name')
    __gsttemplates__ = (Gst.PadTemplate.new("src",
                                            Gst.PadDirection.SRC,
                                            Gst.PadPresence.ALWAYS,
                                            Gst.Caps.from_string("video/x-raw,format=(string)RGB")),
                        Gst.PadTemplate.new("sink",
                                            Gst.PadDirection.SINK,
                                            Gst.PadPresence.ALWAYS,
                                            Gst.Caps.from_string("video/x-raw,format=(string)RGB")))

    def do_transform_ip(self, buf):
        # Do something with the buffer in-place.
        # For simplicity, we just print the buffer size and return.
        print("Buffer size: ", buf.get_size())
        return Gst.FlowReturn.OK

GObject.type_register(MyTransform)
__gstelementfactory__ = ("MyTransform", Gst.Rank.NONE, MyTransform)

def main():
    # Build the elements
    src = Gst.ElementFactory.make("videotestsrc", "src")
    sink = Gst.ElementFactory.make("autovideosink", "sink")
    my_transform = MyTransform()
    # Create a pipeline
    pipeline = Gst.Pipeline.new("mypipeline")
    # Add elements into the pipeline
    pipeline.add(src)
    pipeline.add(my_transform)
    pipeline.add(sink)
    # Link elements
    src.link(my_transform)
    my_transform.link(sink)
    # Start playing
    pipeline.set_state(Gst.State.PLAYING)
    # Wait until error or EOS
    bus = pipeline.get_bus()
    msg = bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                                 Gst.MessageType.ERROR | Gst.MessageType.EOS)
    # Free resources
    pipeline.set_state(Gst.State.NULL)
* Error message:
```log
(<unknown>:56320): GStreamer-Base-CRITICAL **: 15:03:39.106: gst_base_transform_init: assertion 'pad_template != NULL' failed
```

# apedemux: Doesn't map encoder and cover art tags from a WavPack file
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2663 · 2023-06-13 · Gaël Bonithon

I'm using GStreamer 1.20.1 on Arch Linux.
I took a wavpack file from https://filesamples.com/categories/audio, to which I added an image by `wvtag --write-binary-tag 'Cover Art (Front)=@/path/to/image' file.wv`.
The image displays correctly in VLC and MPV for example, but `gst-discoverer-1.0` gives the following output:
```
Properties:
Duration: 0:01:45.772947845
Seekable: yes
Live: no
unknown #0: APE tag
audio #1: Wavpack
Stream ID: 980502cee530e5b6dddb0f749c5f9469a0a54864762b4d74c549f3cedefeb3c6
Language: <unknown>
Channels: 2 (front-left, front-right)
Sample rate: 44100
Depth: 16
Bitrate: 1026898
Max bitrate: 0
```
Initially I tried to find this image in code using the `gst_tag_list_get_sample_index()` function, which fails. But since it's hard for me to isolate a simple piece of code for demonstration, I preferred to report it against `gst-discoverer-1.0`.
Here is the downstream issue for all intents and purposes: https://gitlab.xfce.org/xfce/tumbler/-/issues/46#note_44938

# discoverer: Misidentifies some image files as MPEG, leading to high CPU usage and system lockups
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2658 · 2023-06-12 · Sam Thursfield

The .dds (DirectDraw Surface) file format is handled badly by GstDiscoverer.
To reproduce, you can try this program:
```
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstPbutils', '1.0')
from gi.repository import Gst, GstPbutils
import pathlib
import sys
Gst.init(sys.argv)
discoverer = GstPbutils.Discoverer.new(5 * Gst.SECOND)
path = pathlib.Path(sys.argv[1])
print('Reading: {}'.format(path))
result = discoverer.discover_uri(path.absolute().as_uri())
print(result)
```
Run it against this .dds file: [WTF.dds](/uploads/e25dbf033387a77ecc5eab7b7f554fed/WTF.dds).
At best you'll see the format detection taking a very long time, and giving an incorrect result. At worst, at least on my Fedora 30 machine, this triggers some bug in Linux that causes the whole system to hang.
This has been causing problems in conjunction with the Tracker indexer, as Tracker may scan a directory full of .dds files with [unfortunate results](https://gitlab.gnome.org/GNOME/tracker/issues/95). We can avoid this in Tracker by [blocklisting .dds files](https://gitlab.gnome.org/GNOME/tracker-miners/merge_requests/119), and perhaps reducing the timeout that we pass to `gst_discoverer_new()` but it would be nice to also fix GstDiscoverer if that's possible!
Since reporting this issue, it's been reproduced using `image/ktx` and `image/x-tga` files too.

# uridecodebin: Add properties and documentation for better control over buffering
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2655 · 2023-08-03 · Bugzilla Migration User

## Submitted by Carlos Rafael Giani
**[Link to original bug (#762125)](https://bugzilla.gnome.org/show_bug.cgi?id=762125)**
## Description
Created attachment 321347
uridecodebin patch for enhanced buffering control
Currently, there are only two properties for controlling the buffering: buffer-size and buffer-duration. Properties for the low/high percentage thresholds are missing. Also, the default value for buffer-size and buffer-duration is -1, which means "automatic/default". It is not obvious what this exactly means, and relies on hardcoded internal values.
Furthermore, buffer-duration conflates two distinct concepts: its value is used both for bitrate-based and for input-data-rate-based buffer size estimation. As a result, a buffer-duration value of, for example, 5 seconds can mean either a buffer size of bitrate*5 seconds or, in case of slow connections, in-data-rate*5 seconds, whichever is lower. In many cases, it is desirable to use only one of these two estimations.
The exact way how buffering behaves, how the properties work, and how it should be used is not documented.
This is the first version of a patch that deprecates buffer-size and buffer-duration in favor of three new properties: max-buffer-size, max-buffering-duration, and buffer-estimate-duration.
The new properties work as follows:
* max-buffer-size: The upper limit for the buffer size, in bytes; this value is passed to the internal queue/decodebin as the "max-size-bytes" property; default value is 10 MB
* max-buffering-duration: The in-data-rate*duration estimate mentioned above; this value is passed to the internal queue/decodebin as the "max-size-time" property, but it is *not* used for bitrate based estimations; default value is 0 (= disables data rate based estimates)
* buffer-estimate-duration: The bitrate*duration estimate mentioned above; this value is not passed to the internal queue, and used only if a bitrate tag is encountered; default value is 6.5 seconds
Out of these three, the lowest size (in bytes) is picked.
The patch also makes it possible to set these property during playback; the buffer size will be readjusted on the fly.
Properties for low/high percentage are also introduced. Default values are: low 5%, high 5%. Together with the default values for the other three properties, this means buffering messages will reach 100% once 1.5 seconds are buffered. During playback, if the source can deliver data faster than realtime, additional 5 second can be buffered on the fly. This makes streaming playback more robust against network bandwidth drops without having to let the user wait too long for buffering to finish.
gtkdoc documentation for how to use these new properties and how configuring buffer size works is also added.
Also, a new "will-post-buffering" signal is added. This is emitted whenever uridecodebin sets the "use-buffering" property of an internal queue to TRUE. This is useful for applications to let them know that they should *not* switch to PLAYING just yet, because buffering messages *will* be posted soon. This prevents the possibility that the PLAYING state is reached, playback goes on briefly, and then the application receives the first BUFFERING message, and pauses playback again.
In subsequent patches, playbin could also get these new properties (they'd be forwarded to uridecodebin just like buffer-size and buffer-duration are now), and the new signal. Another planned addition is a "current-buffer-level" property; however, this first requires a patch for multiqueue, since it doesn't have any property like that (queue2 does have "current-level-bytes"). Also, several parsers such as flacparse, wavparse, aiffparse have been found to not push bitrate tags downstream, and therefore also require patching to further improve buffering behavior.
**Patch 321347**, "uridecodebin patch for enhanced buffering control":
[0001-uridecodebin-Add-properties-and-signals-for-better-c.patch](/uploads/dab7fd226cd0303c015cf427dd222271/0001-uridecodebin-Add-properties-and-signals-for-better-c.patch)

Assignee: Edward Hervey

# videoflip: Missing support for ARGB64 video
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2632 · 2023-06-04 · Ruslan Khamidullin

GStreamer version: 1.16.2.
Operating system: Windows 8.1 x64 (desktop), macOS 10.14.6.
**Reproduce:** run the following command (substitute the real video file path):
`gst-launch-1.0 uridecodebin uri="file:///path/to/movie" ! videoconvert ! video/x-raw, format=ARGB64 ! videoflip video-direction=vert ! videoconvert ! pngenc ! multifilesink max-files=1 location=gst_dec_%05d.png`
**Expected:** a flipped frame from the video as a PNG file.
**Actual:** no PNG file, a stderr message instead: `WARNING: erroneous pipeline: could not link videoconvert0 to videoflip0, videoflip0 can't handle caps video/x-raw, format=(string)ARGB64`.
**Note:** if we replace `ARGB64` with `RGBA`, the pipeline works as expected.

# GstMemory needs warnings in documentation or rework
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2622 · 2023-06-12 · Célestin Marot

**A *GstMemory* object created with `gst_memory_new_wrapped(0, ...)` can easily induce data-races.**
As you know, it is a data-race to `memcpy` a memory block that is being written to. So one would think there is a mechanism to prevent writable memory from being copied without locking that memory or without having a reference count equal to 1. However, the longer I dig into the code, the more I realize that GstMemory is only safe when it is directly wrapped in a buffer and then accessed only through *GstBuffer*'s API.
Let me detail this: `gst_memory_new_wrapped()` will simply call `_sysmem_new()` which calls `_sysmem_init()` which in turns calls `gst_memory_init()` with `_sysmem_allocator` as allocator. `_sysmem_allocator` is a *GstAllocator* of type *GstAllocatorSysmem*, which uses `_sysmem_copy()` as copy function. That function basically does an allocation followed by a `memcpy()` without any thread safety mechanism.
Notice that `_fallback_mem_copy()` maps the memory with read access before copying but that function is not used with the *GstAllocatorSysmem*. Hence the behavior is not uniform among allocators.
Therefore, when one calls `gst_memory_copy()` on memory created with *GstAllocatorSysmem*, it will simply do a `memcpy()` without any protection against possible data-races. And no, the user is not supposed to lock the memory before doing a copy: that is not specified anywhere, and even functions like `gst_buffer_append_memory()` will simply call `gst_memory_copy()` if the memory is locked (potentially with write access) by another buffer (see `_memory_get_exclusive_reference()`).
In my humble opinion, in addition to being unnecessarily complicated (see https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/302), the behavior of *GstMemory* is broken in the sense that it does not offer the guarantees one would expect when reading its API. If copying memory requires mapping/locking it with read access, this must be clearly specified and done in GStreamer internal code like `_memory_get_exclusive_reference()`, and removed from allocators' *copy* functions like `_fallback_mem_copy()`. The mapping/locking could also be done in `gst_memory_copy()` itself or in the allocator's *copy* function. Consistency is probably the key here.

# qmloverlay: running example in eglfs platform fails
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2615 · 2023-05-30 · Dariusz Venovski

Compiling and running the qmloverlay example in the gst-plugins-good/tests/examples/qt dir.
App runs fine under X, but when run over eglfs platform fails with error:
```EGLFS: OpenGL windows cannot be mixed with others.```
- GStreamer 1.21.3 (GIT)
- Qt 5.15.2
- Linux

# playbin3: more flexible URI property
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2605 · 2023-07-03 · Guillaume Desmottes

`playbin3` currently implements the same URI properties as `playbin`:
- `uri`: URI of the media to play
- `suburi`: Optional URI of a subtitle
- `current-uri`: The currently playing URI
- `current-suburi`: The currently playing URI of a subtitle
`suburi` can actually be used to play audio or video, at least with `playbin` (see gst-plugins-base#690), so the naming isn't great.
Also, one may want to play 3 streams together (audio, video, text), which is currently not possible, or to quickly switch between different streams of the same type.
I was thinking we could implement something more flexible and generic using the new stream APIs:
- add a new `uris` property taking an array of URIs
- expose them all using `GstStreamCollection`
- activate by default the first audio/video/text streams
- let user activate specific streams using `GST_EVENT_SELECT_STREAMS`
- add a new `current-uris` property containing the subset of `uris` currently activated.
@bilboed : any thoughts?