# gstreamer issues
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues
2024-02-26T08:55:45Z

# csharp: Split GStreamer libraries into separate DLLs for bindings
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3337 (2024-02-26T08:55:45Z, Piotr Brzeziński)

As per discussion in https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5961#note_2258719, ideally each GStreamer library should have a separate DLL containing its own bindings, like GES does now. NuGet packaging can stay as-is.

# Need a standard way to get format-specific stream ID (for HTML5 track "id")
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/45 (2024-02-12T13:01:42Z, Bugzilla Migration User)
## Submitted by Brendan Long
**[Link to original bug (#711522)](https://bugzilla.gnome.org/show_bug.cgi?id=711522)**
## Description
I'm trying to implement the "id" attribute on HTML5 TextTrack, AudioTrack and VideoTrack objects:
http://www.w3.org/TR/html5/embedded-content-0.html#dom-texttrack-id
http://www.w3.org/TR/html5/embedded-content-0.html#dom-audiotrack-id
Currently, what the "id" is isn't standardized, but the WG is starting soon and presumably it will be the standard ID for each format (PID for MPEG-TS, stream serial number for Ogg, track UUID for Matroska).
Some options I considered:
* Stream ID: This is nice because every stream has one, and it contains the value we want at least in Matroska. One problem is that the stream ID isn't currently defined to have any meaning besides being a unique identifier (although it could be if we wanted it to). The other problem is that all streams need a stream ID, but not all formats actually have an internal ID (the output of audiotestsrc, or WAV files for example).
* GST_TAG_SERIAL: This is defined as an unsigned integer, and not all formats use an unsigned integer for their IDs (Matroska comes to mind). Even if all formats did now, it wouldn't be future-proof.
My suggestion is to add a GST_TAG_ID, which is defined as "the ID for this stream/track in the original format", which could then contain the PID, stream serial, etc.
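A minimal sketch of what registering such a tag could look like, assuming the proposed name GST_TAG_ID and a string type (both hypothetical, not upstream API):

```c
#include <gst/gst.h>

/* Hypothetical: neither this GST_TAG_ID macro nor its registration
 * exists upstream; this only illustrates the proposal. */
#define GST_TAG_ID "container-id"

static void
register_id_tag (void)
{
  gst_tag_register (GST_TAG_ID, GST_TAG_FLAG_META, G_TYPE_STRING,
      "container ID",
      "the ID for this stream/track in the original format", NULL);
}
```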
# GstBin: re-enable or remove duration caching
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/4 (2024-01-29T10:18:27Z, Bugzilla Migration User)

## Submitted by Tim Müller `@tpm`
**[Link to original bug (#324807)](https://bugzilla.gnome.org/show_bug.cgi?id=324807)**
## Description
Something seems to be wrong with duration query caching in bins.
I've got a playbin using a CD audio source. When the source changes tracks, it posts a new DURATION message on the bus with the duration of the new track. That should clear the existing cached durations, yet subsequent gst_element_query (playbin, ...) calls after that return the same duration as before.
Haven't had a chance to look into it yet. Things work fine though if I comment out the block where it returns cached durations in gstbin.c, gst_bin_query().
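For reference, a minimal sketch of the pattern that misbehaves, written against today's API (GST_MESSAGE_DURATION_CHANGED is the 1.x name for the DURATION message mentioned above):

```c
#include <gst/gst.h>

static gboolean
on_message (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  GstElement *playbin = GST_ELEMENT (user_data);

  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_DURATION_CHANGED) {
    gint64 dur;

    /* The duration message should have invalidated the bin's cached
     * duration, so this ought to report the new track's length. */
    if (gst_element_query_duration (playbin, GST_FORMAT_TIME, &dur))
      g_print ("duration: %" GST_TIME_FORMAT "\n", GST_TIME_ARGS (dur));
  }
  return TRUE;
}
```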
# filesink: non-fatal out of disk space handling
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3 (2024-01-29T10:18:23Z, Bugzilla Migration User)

## Submitted by Tim Müller `@tpm`
**[Link to original bug (#309117)](https://bugzilla.gnome.org/show_bug.cgi?id=309117)**
## Description
What would be really nice to have in filesink (and gnomevfssink as well probably, if possible) is non-fatal out-of-disk-space handling.
File sink could fire an out-of-disk-space signal and put the buffer it wanted to write in an internal queue. The application can pause the pipeline at this point and show a warning to the user. Once the user has freed some space, the application can continue doing what it was doing before, without a single byte lost.
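A sketch of what the application side of this proposal might look like; the "out-of-disk-space" signal is hypothetical and does not exist in filesink:

```c
#include <gst/gst.h>

/* Hypothetical handler for the proposed signal. */
static void
on_out_of_disk_space (GstElement * filesink, gpointer user_data)
{
  GstElement *pipeline = GST_ELEMENT (user_data);

  /* Pause, warn the user, resume once space has been freed; the buffer
   * queued inside filesink would ensure nothing is lost. */
  gst_element_set_state (pipeline, GST_STATE_PAUSED);
  g_warning ("Out of disk space - free some space and resume");
}

static void
setup_disk_space_handling (GstElement * filesink, GstElement * pipeline)
{
  g_signal_connect (filesink, "out-of-disk-space",
      G_CALLBACK (on_out_of_disk_space), pipeline);
}
```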
This would be a very neat function to have for (non-realtime) encoders, transcoders, sound-juicer IMHO.
Cheers
-Tim
### Blocking
* [Bug 318853](https://bugzilla.gnome.org/show_bug.cgi?id=318853)

# dvdspu: figure out how to make it work with hardware decoders and subpicture overlays
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3179 (2024-01-02T15:05:45Z, Bugzilla Migration User)
## Submitted by Jan Schmidt `@thaytan`
Assigned to **Jan Schmidt `@thaytan`**
**[Link to original bug (#685282)](https://bugzilla.gnome.org/show_bug.cgi?id=685282)**
## Description
Getting the DVD SPU to paint generically is a requirement for allowing the DVD elements to plug / output hardware decoder caps.
Here's a conversation we had about it on IRC:
Sep 26 01:37:04 * thaytan wonders how SPU works
Sep 26 01:37:19 `<bilboed>` thaytan, with vdpau ?
Sep 26 01:37:24 `<thaytan>` *nod*
Sep 26 01:37:35 `<thaytan>` how it should work, that is
Sep 26 01:37:36 `<bilboed>` they have also VdpBitmapSurface
Sep 26 01:38:01 `<bilboed>` so VdpVideoSurface => YUV stuff, VdpBitmapSurface => RGB stuff
Sep 26 01:38:11 `<bilboed>` VdpOutputSurface => output of the compositor for display
Sep 26 01:38:40 `<bilboed>` you can create VdpBitmapSurface and map(write) stuff on it
Sep 26 01:39:01 `<bilboed>` then you give it to the compositor with coordinates (and whatever else) and that's basically it
Sep 26 01:39:09 `<thaytan>` hmmm
Sep 26 01:39:26 `<bilboed>` so as a result ... I'm also gonna have to figure out how to solve hw-compositing
Sep 26 01:39:28 `<thaytan>` not sure I see how that fits into a GStreamer/rsndvdbin context
Sep 26 01:39:31 `<bilboed>` in a generic fashion that is
Sep 26 01:39:55 `<bilboed>` thaytan, was thinking you could slap GstOverlayMeta (or sth like that) with attached GstBuffers
Sep 26 01:40:06 `<bilboed>` thaytan, maybe videomixer or some generic element could do that
Sep 26 01:40:09 `<thaytan>` I guess it's a vdpspu element with video and subpicture inputs as currently with dvdspu
Sep 26 01:40:25 `<thaytan>` except the video pad accepts vdp output surface caps
Sep 26 01:40:38 `<bilboed>` I'd prefer to avoid creating yet-another-custom-element
Sep 26 01:40:45 `<thaytan>` but, I don't know what it outputs
Sep 26 01:41:12 `<thaytan>` bilboed: I don't know how to build it generically
Sep 26 01:41:48 `<thaytan>` the dvdspu element uses the video stream passing through to a) paint onto b) uses the timestamps to crank the SPU state machine
Sep 26 01:42:44 `<bilboed>` what do you need more apart from "put this RGB bitmap at these coordinates for this timestamp and this duration"
Sep 26 01:43:48 `<thaytan>` it needs the timestamps and segment info on the video stream so it knows which pixels to generate
Sep 26 01:44:04 `<thaytan>` it's more "here's a video frame, what's the overlay?"
Sep 26 01:44:13 `<thaytan>` also, dvdspu works in YUV too
Sep 26 01:44:13 `<bilboed>` sure, but it doesn't care about the *content* of that frame
Sep 26 01:44:28 `<thaytan>` bilboed: not if it's not doing the compositing, no
Sep 26 01:44:34 `<bilboed>` right
Sep 26 01:45:11 `<bilboed>` so it could see the stream go through, watch/collect segment/timestamps and decide what subpicture to attach to it (without *actually* doing any processing and letting downstream handle that)
Sep 26 01:45:13 `<thaytan>` but the model is still to pass the video buffer stream through the spu element so it can see the timing info it needs, right?
Sep 26 01:45:22 `<thaytan>` oh, of course
Sep 26 01:45:28 `<thaytan>` that's what I was suggesting, I guess I wasn't clear
Sep 26 01:45:35 `<__tim>` thaytan, dvdspu should support GstVideoOverlayComposition imho
Sep 26 01:45:38 `<bilboed>` I'm not *that* familiar with SPU
Sep 26 01:45:42 `<bilboed>` also, what __tim said :)
Sep 26 01:45:54 `<bilboed>` like that I don't have to solve it in 500 different elements
Sep 26 01:45:56 `<thaytan>` ok, I guess I'll have to look at the GstVideoOverlayComposition API
Sep 26 01:46:54 * bilboed is not looking forward "at all" to fixing this for cluster-gst
Sep 26 01:47:12 `<__tim>` it's very dumb, you can provide one or more ARGB or AYUV rectangles and either use helper API to put them onto the raw video, or attach them to the buffer; the sink or whatever can then take over the overlaying using that
Sep 26 01:47:40 `<__tim>` and it will do a bunch of conversions for you and cache them if the sink or whatever does the overlaying doesn't support what you supplied
Sep 26 01:48:54 `<thaytan>` well, that sounds feasible - although less efficient than dvdspu painting natively if the composite is in software
Sep 26 01:49:28 `<thaytan>` maybe it can be extended to add RLE AYUV rectangles as a format though?
Sep 26 01:49:44 `<__tim>` thaytan, how sure are you of that? because basically you have to parse the RLE data for every single frame, right? is that really so much faster than blitting some ready-made rectangle using orc?
Sep 26 01:50:04 `<thaytan>` dvdspu gets to skip a lot of transparent pixels
Sep 26 01:52:21 `<__tim>` yeah, but it's if else and loops etc. You might be right, I'm just curious how much difference it actually makes. Also, you don't have to use the API to blit your pixels, you can still do that as you do now and only attach the AYUV rectangle if downstream supports that
Sep 26 01:54:13 `<__tim>` it's just convenient because you only have one code path
Sep 26 01:55:01 `<thaytan>` it sounds like a structural improvement
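For context, a rough sketch of the GstVideoOverlayComposition route __tim suggests above; the helper name and the assumption that `pixels` is a premultiplied-alpha buffer carrying a GstVideoMeta are mine:

```c
#include <gst/video/video-overlay-composition.h>

static GstBuffer *
attach_spu_overlay (GstBuffer * video_buf, GstBuffer * pixels,
    gint x, gint y, guint w, guint h)
{
  GstVideoOverlayRectangle *rect;
  GstVideoOverlayComposition *comp;

  rect = gst_video_overlay_rectangle_new_raw (pixels, x, y, w, h,
      GST_VIDEO_OVERLAY_FORMAT_FLAG_PREMULTIPLIED_ALPHA);
  comp = gst_video_overlay_composition_new (rect);
  gst_video_overlay_rectangle_unref (rect);

  /* Attach the rectangle; a downstream sink that supports the meta can
   * then do the blending itself, e.g. on the GPU. */
  video_buf = gst_buffer_make_writable (video_buf);
  gst_buffer_add_video_overlay_composition_meta (video_buf, comp);
  gst_video_overlay_composition_unref (comp);

  return video_buf;
}
```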
### Blocking
* [Bug 663750](https://bugzilla.gnome.org/show_bug.cgi?id=663750)
* [Bug 725900](https://bugzilla.gnome.org/show_bug.cgi?id=725900)

# gst-rtsp-server: Decrease RTSP setup time
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2443 (2023-10-02T17:14:50Z, Patricia Muscalu)
Problem:
Setting suspend mode RESET on an RTSP media brings the live pipeline to NULL state on suspend.
When a PLAY request arrives, the pipeline is unsuspended and set to the PLAYING state. The source element has to release and reopen stream resources, and this operation, in our case, is time consuming due to device resource limits.
Solution:
Introduce a new suspend mode, that doesn't change the state of the media on suspend, meaning that the media is still PLAYING but remains blocked. When the media is unsuspended, the GstForceKeyUnit event is sent to the pipeline and the pipeline is flushed. The new generated key frame will unblock the media.
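For reference, suspend modes are selected per media factory today; the sketch below shows the existing RESET behaviour that the proposed (hypothetical) blocked mode would sit alongside:

```c
#include <gst/rtsp-server/rtsp-server.h>

static void
configure_factory (GstRTSPMediaFactory * factory)
{
  /* Existing behaviour under discussion: RESET brings the media to NULL
   * on suspend, forcing the source to reopen its resources on PLAY.
   * The proposal adds a mode that keeps the pipeline PLAYING but
   * blocked instead. */
  gst_rtsp_media_factory_set_suspend_mode (factory,
      GST_RTSP_SUSPEND_MODE_RESET);
}
```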
The patch will be provided.

# gl: Add I420_10LE support
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2941 (2023-09-19T23:06:19Z, Nicolas Dufresne)
A drive-by small enhancement we could do: this is the 10-bit format produced by the FFmpeg HEVC decoder (at least). It would allow skipping videoconvert for:
```
... ! avdec_h265 ! glimagesink
```
With 10bit files. Please note that unlike P010, the padding bits are placed in the most significant bits.

# rtmp2src: no way to know if stream was properly terminated or not on eos
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2828 (2023-08-04T12:24:33Z, Guillaume Desmottes)

If the RTMP server closes the connection for any reason, `rtmp2src` will [log the disconnection as INFO](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L927), stop its streaming task and [return `GST_FLOW_EOS`](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L614) the next time `gst_rtmp2_src_create()` is called.
`EOS` is also returned if the stream has been [properly terminated by the server](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L941).
I need a way to distinguish between these two scenarios in my application. In the first case I want to retry connecting to the RTMP server, while in the second I terminate the streaming pipeline. Unfortunately there is currently no way to know the actual reason why `EOS` is being returned.
Note that `EOS` is [also returned because of the idle timeout](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L617) but I don't actually use that.
The obvious solution would be to return `GST_FLOW_ERROR` instead of `EOS` on connection error. But I suppose there was a good reason to not do that in the first place? Like some server never properly terminating the stream when it's done?
So maybe this should be controlled using a new property?
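To illustrate the problem from the application side (a sketch; the handler name is mine):

```c
#include <gst/gst.h>

static gboolean
on_bus_message (GstBus * bus, GstMessage * msg, gpointer user_data)
{
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_EOS:
      /* Dropped connection, idle timeout or proper stream end all look
       * identical here: should we reconnect or shut down? */
      break;
    case GST_MESSAGE_ERROR:
      /* Never posted by rtmp2src for a dropped connection today. */
      break;
    default:
      break;
  }
  return TRUE;
}
```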
@heftig @vivia: you seem to have done some work on this plugin, so maybe you know more about why we handle connection errors this way?
# nvav1enc: Add support for av1 hardware encoding
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1608 (2023-07-24T19:47:01Z, 谢美龙)

The 8th generation nvenc has added support for av1 encoding. [Support Matrix](https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new).
And rtpav1pay was just released in GStreamer 1.21.1. nvav1enc + rtpav1pay + webrtc will be a nice combination.
[original issue](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1489)

# Windows PTP support not integrated into build system
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1259 (2023-06-29T10:00:05Z, Sebastian Dröge)
See https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gstreamer/libs/gst/helpers/meson.build#L30-31
```meson
elif host_system == 'windows'
message('PTP not supported on Windows, not ported yet.')
```
The PTP code contains Windows-specific code paths so I assume this was working with the autotools build.
CC @nirbheek

Assignee: Sebastian Dröge

# Drop use of GSlice in GStreamer
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/291 (2023-06-26T12:10:54Z, Bugzilla Migration User)
## Submitted by Tim Müller `@tpm`
**[Link to original bug (#795828)](https://bugzilla.gnome.org/show_bug.cgi?id=795828)**
## Description
Created attachment 371701
Drop use of GSlice allocator in favour of plain g_malloc()/g_free()
Just gonna put this here. Enjoy.
The "Benchmarks show" bit could probably need more verification, but in the few tests I have run I have found either positive performance impact or no discernable performance impact. It's hard to do proper measurements for our purposes, we need to test alloc from one thread and free in another thread for realistic usage, while at the same time having a test case where allocation/free takes up most of the cycles. I have run some tests on a low-powered windows ec2 machine, that showed GSLice being ridiculously slow compared to the system allocator there (windows server 2006 =~ win10).
IIRC the main reason to use GSlice was that the system allocator on Windows XP was horrendous, but that doesn't seem to be the case any longer, and on Linux the glibc allocator seems superior nowadays.
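The mechanical change the patch makes, sketched on a made-up struct:

```c
#include <glib.h>

typedef struct { int a, b; } Foo;

static void
demo (void)
{
  /* Before the patch: GSlice allocator. */
  Foo *f = g_slice_new0 (Foo);
  g_slice_free (Foo, f);

  /* After the patch: plain GLib allocator. */
  Foo *g = g_new0 (Foo, 1);
  g_free (g);
}
```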
**Patch 371701**, "Drop use of GSlice allocator in favour of plain g_malloc()/g_free()":
[0001-Drop-use-of-GSlice-allocator-in-favour-of-plain-g_ma.patch](/uploads/438addaac2f08f0d1693e214ed8bab72/0001-Drop-use-of-GSlice-allocator-in-favour-of-plain-g_ma.patch)
### See also
* [Bug 754687](https://bugzilla.gnome.org/show_bug.cgi?id=754687)

Milestone: 1.23.1

# qt qml plugin for QT6 is not built on Windows
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1703 (2023-06-23T22:08:53Z, Mohammad Sadeghzadeh)

We have cloned version `1.21.3` and set meson options to build GStreamer using `qt6`, but it does not build on Windows with `msvc 2019`.

# flv/rtmp: Add support for "enhanced" RTMP
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2624 (2023-06-23T09:49:53Z, Sebastian Dröge)

See https://github.com/veovera/enhanced-rtmp. Among other things this adds H265 support. There are patches for ffmpeg already.

# clockoverlay: how to overlay a d3d11 surface?
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2479 (2023-06-15T15:19:39Z, Maurizio Burato)
I need to overlay date and time text on D3D11 surfaces.
I tried:

```
gst-launch-1.0.exe -v d3d11testsrc ! video/x-raw(memory:D3D11Memory) ! clockoverlay halignment=2 valignment=1 time-format="%e %b %Y %H:%M:%S" ! nvd3d11h264enc ! h264parse ! fakesink
```

but I get:

```
WARNING: erroneous pipeline: could not link clockoverlay0 to nvd3d11h264enc0
```
Is there another method?
Thanks

# gstplayer: Add gst_player_get_state API
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2643 (2023-06-12T14:47:59Z, Bugzilla Migration User)
## Submitted by Lyon
**[Link to original bug (#778379)](https://bugzilla.gnome.org/show_bug.cgi?id=778379)**
## Description
For the gstplayer state, currently we can only get the state via the state-changed callback once the mainloop has started running. However, if we need the current state before the mainloop has started running, there is no way to get the gstplayer state.
So consider adding a gst_player_get_state() API to get the current player state.
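A sketch of the requested getter's signature (hypothetical; GstPlayer only reports state through the state-changed signal today):

```c
#include <gst/player/player.h>

/* Proposed, not existing API: synchronously return the current
 * GstPlayerState without waiting for the mainloop to dispatch the
 * state-changed signal. */
GstPlayerState gst_player_get_state (GstPlayer * player);
```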
Version: 1.x

# utils: Add a "gst_element_create_stream_id" method
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/292 (2023-06-06T11:56:35Z, Bugzilla Migration User)
## Submitted by Edward Hervey `@bilboed`
**[Link to original bug (#795976)](https://bugzilla.gnome.org/show_bug.cgi?id=795976)**
## Description
Currently the various gst_pad_create_stream_id functions require a GstPad. But if you use the stream_id argument ... it doesn't use that pad.
For stream providers (like demuxers) that wish to create stream IDs (based on upstream information) without creating pads, it would be great to have similar functions that don't require a pad.

Assignee: Edward Hervey
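A sketch of the kind of pad-less variant being asked for, mirroring gst_pad_create_stream_id() (hypothetical signature, not existing API):

```c
#include <gst/gst.h>

/* Proposed: derive a stream id from the element's upstream stream
 * id(s) without needing a GstPad. */
gchar * gst_element_create_stream_id (GstElement * element,
    const gchar * stream_id);
```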
# Need API to know allocated buffers from upstream
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/238 (2023-06-01T16:28:14Z, Bugzilla Migration User)

## Submitted by Guillaume Desmottes `@gdesmott`
**[Link to original bug (#783085)](https://bugzilla.gnome.org/show_bug.cgi?id=783085)**
## Description
We are working on implementing DMA support in gst-omx. When acting as a DMA importer, an element needs to know beforehand which buffers have been allocated upstream so it can call OMX_UseBuffer() on each allocated buffer it's going to use.
We currently solved this using a custom downstream event sent in gst_v4l2src_decide_allocation() but it would be good to have a proper solution for this.
We could add an "allocated-buffers" serialized downstream event which is sent when buffers are (re)allocated, before starting to use them.
It would include a GPtrArray of GstBuffers.
We should probably be clear in the doc on what can and cannot be done with those buffers. Something like "Those buffers should not be modified in any way."
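A sketch of how such an event could be constructed; the event name and field are the proposal, not existing API:

```c
#include <gst/gst.h>

static GstEvent *
make_allocated_buffers_event (GPtrArray * buffers)
{
  /* Serialized custom downstream event carrying the array of
   * upstream-allocated GstBuffers (stored as a boxed GPtrArray). */
  GstStructure *s = gst_structure_new ("allocated-buffers",
      "buffers", G_TYPE_PTR_ARRAY, buffers, NULL);

  return gst_event_new_custom (GST_EVENT_CUSTOM_DOWNSTREAM, s);
}
```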
Thoughts?

# avoid decoding not-linked streams
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/49 (2023-03-24T07:11:27Z, Bugzilla Migration User)
## Submitted by Thiago Sousa Santos `@thiagossantos`
**[Link to original bug (#722666)](https://bugzilla.gnome.org/show_bug.cgi?id=722666)**
## Description
Since the 0.10 -> 1.0 move, the feature that was used to make not-linked streams skip decoding until linked again is gone.
It was done by input-selector returning not-linked on pad-allocs and the decoder would drop that buffer and return not-linked. Nothing has replaced this behavior in 1.0 (yet).
This can be easily demonstrated with the following scenario:
1) Create a test clip with multiple videos:
```
gst-launch-1.0 videotestsrc num-buffers=500 pattern=0 ! x264enc ! qtmux name=m ! filesink location=3video.mov videotestsrc num-buffers=500 pattern=1 ! x264enc ! m. videotestsrc num-buffers=500 pattern=18 ! x264enc ! m.
```
2) Play the clip and look at the debug logs for the video decoders:
```
GST_DEBUG=*videodec*:9,*avdec*:9 gst-launch-1.0 playbin uri=file:///path/to/3video.mov
```
A similar scenario is solved in dash/mss demuxers using the not-linked return and reconfigure events to stop/reactivate those streams. The use case is a bit different but I guess the mechanism can work the same.
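A sketch of that mechanism in 1.0 terms (MyDec and my_dec_decode are hypothetical element internals, not existing base-class behaviour): skip decoding while downstream is not linked, resume when the srcpad is reconfigured.

```c
#include <gst/gst.h>

typedef struct {
  GstPad *srcpad;
  gboolean not_linked;
} MyDec;

/* Stand-in for the expensive decode step. */
static GstBuffer *my_dec_decode (MyDec * dec, GstBuffer * buf);

static GstFlowReturn
my_dec_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
  MyDec *dec = gst_pad_get_element_private (pad);
  GstFlowReturn ret;

  /* While downstream is not linked, drop input without decoding until
   * a relink sets the RECONFIGURE flag on the srcpad. */
  if (dec->not_linked && !gst_pad_check_reconfigure (dec->srcpad)) {
    gst_buffer_unref (buf);
    return GST_FLOW_NOT_LINKED;
  }
  dec->not_linked = FALSE;

  ret = gst_pad_push (dec->srcpad, my_dec_decode (dec, buf));
  if (ret == GST_FLOW_NOT_LINKED)
    dec->not_linked = TRUE;

  return ret;
}
```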
Possible issues:
1) Is this a behavior break?
2) How to avoid dropping buffers that are ahead of the current play position? It would be better if the decoder would only drop a buffer to avoid decoding when it knew that this buffer is too late to be used.
3) How does this interfere with the buffer pools and allocation queries?

# urisourcebin: src pads have no caps when added
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2384 (2023-03-20T13:45:32Z, Guillaume Desmottes)

I had to use `urisourcebin parse-streams=true` manually and it's been trickier than anticipated. One cannot just connect to the `pad-added` signal, check the pad caps and add the elements to handle the stream, as the pad has no caps yet at this point. Because of that I had to set up a blocking probe and wait for the caps event, which can be tricky if you're not so used to GStreamer.
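The workaround described above looks roughly like this (a sketch; the actual linking logic is elided):

```c
#include <gst/gst.h>

static GstPadProbeReturn
caps_probe (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GstEvent *event = GST_PAD_PROBE_INFO_EVENT (info);
  GstCaps *caps;

  if (GST_EVENT_TYPE (event) != GST_EVENT_CAPS)
    return GST_PAD_PROBE_PASS;  /* let earlier events through */

  gst_event_parse_caps (event, &caps);
  /* ... inspect caps, build and link the handling branch ... */

  return GST_PAD_PROBE_REMOVE;  /* unblock the pad */
}

static void
on_pad_added (GstElement * urisourcebin, GstPad * pad, gpointer user_data)
{
  gst_pad_add_probe (pad,
      GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM,
      caps_probe, user_data, NULL);
}
```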
Would it make sense to wait for the caps before exposing the `src` pads and so make all this more convenient for users?

# va: Add support for libva-win32
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1750 (2023-03-10T15:06:28Z, Sil Vilerino)

Since [libva 2.17](https://github.com/intel/libva/releases/tag/2.17.0) there's a new VAAPI display "libva-win32" that allows VA acceleration on Windows OS. Similarly, [Mesa 22.3](https://docs.mesa3d.org/relnotes/22.3.0.html) released `vaon12_drv_video.dll`, a VA driver for Windows, which is based on D3D12 Video APIs and implements the following entrypoints (where hardware/driver support is available) on Windows OS:
```
VAProfileH264ConstrainedBaseline: VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointEncSlice
VAProfileH264High : VAEntrypointVLD
VAProfileH264High : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointVLD
VAProfileHEVCMain : VAEntrypointEncSlice
VAProfileHEVCMain10 : VAEntrypointVLD
VAProfileHEVCMain10 : VAEntrypointEncSlice
VAProfileVP9Profile0 : VAEntrypointVLD
VAProfileVP9Profile2 : VAEntrypointVLD
VAProfileAV1Profile0 : VAEntrypointVLD
VAProfileNone : VAEntrypointVideoProc
```
Enhancing/extending GStreamer with VAAPI `VADisplay` initialization code using the new `vaGetDisplayWin32` function would allow the vaapi* related plugins to work on Windows. Used together with the `vaon12` Mesa driver, this would give GStreamer access to (libva-layered) D3D12 video support for decode, encode and some video processing on any device supporting D3D12 Video APIs. Please note the vaapi* plugins are already working via `libd3d12.so` and the `d3d12_drv_video.so` Mesa driver in `Windows Subsystem for Linux` via the existing VAAPI DRM/X11 initialization paths.
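As I understand the libva-win32 entry point, the bring-up would look roughly like this (a sketch; passing NULL is assumed to select the default adapter):

```c
#include <va/va.h>
#include <va/va_win32.h>

static VADisplay
open_va_display_win32 (void)
{
  /* NULL adapter LUID: let libva-win32 pick the default adapter. */
  VADisplay dpy = vaGetDisplayWin32 (NULL);
  int major, minor;

  if (!dpy || vaInitialize (dpy, &major, &minor) != VA_STATUS_SUCCESS)
    return NULL;

  return dpy;
}
```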