# GStreamer issues
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues

---

# Issue #474: Merge all GStreamer modules into one single mono repository
Reporter: Thibault Saunier (tsaunier@igalia.com), updated 2021-09-29
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/474

GitLab didn't like having such a big merge request, so I am going to open an issue instead of an MR. The branch described here can be found here:
- https://gitlab.freedesktop.org/thiblahute/gstreamer/-/tree/monorepo_simple
This branch lays out the foundations for merging all our repositories into
one.
This has been discussed in the community for a long time, and I *think* we have a
consensus about the principle of a mono repo.
The goal of this discussion should be to reach a *practical*, explicit
consensus now. This is too important a decision not to have everyone in `@gstreamer`
on board and aware of the implications.
## Pros and cons of a monorepo
### Pros:
- Simplifies cross-module development; `gst-build` helps with that, but friction points related to the current multi-repo approach remain
- Simplifies CI a lot
- ~~Simplifies the build definitions a lot~~ in the long run
- Makes it clearer what the official GStreamer modules are
- ~~Makes it possible to have experimental internal APIs usable in all the `gstreamer/` monorepo without exposing it outside GStreamer~~ in the long run
- Makes bisecting regressions much simpler
- Avoids having to duplicate changes in each and every repo when, for example, some GLib symbols are deprecated, or when something changes in Meson.
- Simplifies the release process as we won't need to handle 20 different modules, but 1!
### Cons:
- Legal issues with `gst-plugins-ugly`? How does FFmpeg handle these issues?
> We will still release the same modules so it shouldn't be a big problem
- Some disruption in the development and packaging process, this is a one time thing though
- The mono git repository is quite big (the .git is ~500MB)
- No bisecting across the merge point
## Technical solution
To avoid losing any history and to bring all modules into one repository, without breaking
individual commit SHAs, repositories have been merged with the following pseudo-code (run from the GStreamer core repository):
```
foreach GSTREAMER_MODULE
git remote add GSTREAMER_MODULE.name GSTREAMER_MODULE.url
git fetch GSTREAMER_MODULE.name
git merge GSTREAMER_MODULE.name/master
git mv list_all_files_from_merged_gstreamer_module() GSTREAMER_MODULE.shortname
git commit -m "Moved all files from " + GSTREAMER_MODULE.name
endforeach
```
The exact code used can be found [here](https://gitlab.freedesktop.org/thiblahute/gst-merger/blob/master/merge.py); it is extended from a script by @xclaesse.
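To make the pseudo-code above more concrete, here is a minimal Python sketch (this is *not* the actual merge.py linked above; the module name, URL and top-level file list used below are illustrative assumptions). It only builds the list of git invocations for one module, mirroring each step of the pseudo-code:

```python
def merge_plan(name, url, top_level_files, branch="master", prefix="subprojects"):
    """Return the git commands that fold one module into the monorepo while
    keeping every original commit SHA intact. `top_level_files` stands in for
    list_all_files_from_merged_gstreamer_module() in the pseudo-code."""
    plan = [
        ["git", "remote", "add", name, url],
        ["git", "fetch", name],
        # The module shares no ancestor with core, hence the extra flag.
        ["git", "merge", "--allow-unrelated-histories", f"{name}/{branch}"],
    ]
    # Move everything the merge brought in under subprojects/<module>/
    plan += [["git", "mv", f, f"{prefix}/{name}/"] for f in top_level_files]
    plan.append(["git", "commit", "-m", f"Moved all files from {name}"])
    return plan


if __name__ == "__main__":
    # Print (rather than execute) the plan for one illustrative module.
    for argv in merge_plan(
            "gst-plugins-base",
            "https://gitlab.freedesktop.org/gstreamer/gst-plugins-base.git",
            ["gst-libs", "meson.build"]):
        print(" ".join(argv))
```

Note that `--allow-unrelated-histories` is required since Git 2.9 to merge branches that share no common ancestor, which is exactly the situation when folding an independent module repository into core.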
This approach has the following advantages:
- checksum of each individual commit is preserved
- we can cherry-pick commits from the now-merged modules, allowing pending merge requests to easily be reopened against the new monorepo
- we can cherry-pick patches from the monorepo to the previous individual module repos for stable releases by doing:
``` bash
cd $GST_BUILD_PATH/gst-plugins-base
git remote add monorepo https://gitlab.freedesktop.org/gstreamer/gstreamer.git
git fetch monorepo
git cherry-pick <sha from monorepo>
```
Note: I am not using [git subtree](https://manpages.debian.org/testing/git-man/git-subtree.1.en.html), which does almost the same thing as what we do manually here (but in one single commit), because `git log --follow some/file` [doesn't work](https://stackoverflow.com/questions/10918244/git-subtree-without-squash-view-log) with it, whereas it does with this solution. Apart from that issue, I can't think of a good reason not to use it.
## What does that branch implement
* Merged modules with the new directory they landed in:
- gstreamer -> subprojects/gstreamer
- gst-plugins-base -> subprojects/gst-plugins-base/
- gst-plugins-good -> subprojects/gst-plugins-good/
- gst-plugins-bad -> subprojects/gst-plugins-bad/
- gst-plugins-ugly -> subprojects/gst-plugins-ugly/
- gstreamer-vaapi -> subprojects/gstreamer-vaapi/
- gst-omx -> subprojects/gst-omx/
- gst-libav -> subprojects/gst-libav/
- gst-ci -> ci/
- gst-rtsp-server -> subprojects/gst-rtsp-server/
- gst-editing-services -> subprojects/gst-editing-services/
- gst-devtools -> subprojects/gst-devtools/
- gst-docs -> subprojects/gst-docs/
- gst-examples -> subprojects/gst-examples/
- gst-integration-testsuites -> subprojects/gst-tests/integration/
- gst-python -> subprojects/gst-python/
- gstreamer-sharp -> subprojects/gstreamer-sharp/
- ~~gstreamer-rs -> subprojects/gstreamer-rs/~~
- ~~gst-plugins-rs -> subprojects/gst-plugins-rs~~
- gst-build -> scripts/ and tests/
* The `gstreamer` core module is very close to what `gst-build` was in terms of implementation
* cerbero has been ported in https://gitlab.freedesktop.org/gstreamer/cerbero/-/merge_requests/653
* CI has been ported over
* Build products parity
## What is known to be missing
* Moving more files around manually: remove some READMEs, add a top-level LICENSE file, etc.
* Fix documentation (although that was started already)
* See how we can better handle `modulename=auto` in meson
* Mass move issues from the different modules to the `gstreamer` project (should be simple to do in the UI)
## Plan
- [x] Agree on the wanted structure of the mono repo
- [x] Generate the monorepo again with latest GStreamer
- [x] Update to latest CI and make sure it passes - last 'scheduled' pipeline can be found [here](https://gitlab.freedesktop.org/thiblahute/gstreamer/-/pipelines/228623)
- includes everything from latest gst-ci
- includes everything from latest gstreamer-rs (only executed on 'scheduled' jobs)
- details about cerbero related job can be found [here](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/474#note_689946)
- [x] Build a cerbero branch
- [~] Update the documentation
- [x] ~~Do mock releases of the various modules (Issue in meson to make it work fixed in https://github.com/mesonbuild/meson/pull/7907)~~ (do later)
- [x] Send a mail to the gstreamer-devel mailing list to let the broader community know about the plan
This needs to contain a summary of the plan - Thread available [here](https://lists.freedesktop.org/archives/gstreamer-devel/2020-November/076754.html)
- [x] ~~Mass move issues from the different modules to the gstreamer project (simple to do in the UI)~~ (*No automatic mass-move of issues, we want to triage them, see monorepo FAQ*)
- [x] We should add tags to the issues before moving them (*Tags have been added*)
- [x] ~~Write a message on all opened MRs explaining how to move them to the mono repo and linking to this issue~~ (*No automatic mass-comment on open MRs, let's have active contributors move MRs via the script first, collect feedback to improve the script and then over time triage the remaining MRs and suggest they be moved and/or post link to script at a later time*)
- [x] First try to make a script to move open MRs into GStreamer core
- [x] Get https://github.com/mesonbuild/meson/pull/7907 merged
- [x] Create a temporary `monorepo` branch on origin (as proposed [here](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/474#note_1057178))
- [x] Open MR against that `monorepo` branch for review
- [x] ~~Once the monorepo is merged we should update READMEs in all merged repos to link with `gstreamer` core~~ (*clean-up for later; N.B. not all READMEs are the same*)
# Misc notes
* For modules that do not follow the exact same release cycle (only gstreamer-rs) the release tags should be prefixed. **EDIT** - We won't make `gstreamer-rs` part of the monorepo for now, as discussed [here](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/474#note_988425)

Milestone: 1.19.3

---

# Issue #1703: qt qml plugin for QT6 is not built on Windows
Reporter: Mohammad Sadeghzadeh, updated 2023-06-23
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1703

We have cloned version `1.21.3` and set Meson options to build GStreamer using `qt6`, but it is not built on Windows with MSVC 2019.

---

# Issue #881: AV1 RTP Payload support
Reporter: Philippe Normand, updated 2022-09-12
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/881

It would be nice to have an AV1 RTP (de)payloader implementing this spec: https://aomediacodec.github.io/av1-rtp-spec/

Milestone: 1.21.1

---

# Issue #3179: dvdspu: figure out how to make it work with hardware decoders and subpicture overlays
Reporter: Bugzilla Migration User, updated 2024-01-02
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3179

## Submitted by Jan Schmidt `@thaytan`
Assigned to **Jan Schmidt `@thaytan`**
**[Link to original bug (#685282)](https://bugzilla.gnome.org/show_bug.cgi?id=685282)**
## Description
Getting the DVD SPU to paint generically is a requirement for allowing the DVD elements to plug / output hardware decoder caps.
Here's a conversation we had about it on IRC:
Sep 26 01:37:04 * thaytan wonders how SPU works
Sep 26 01:37:19 `<bilboed>` thaytan, with vdpau ?
Sep 26 01:37:24 `<thaytan>` *nod*
Sep 26 01:37:35 `<thaytan>` how it should work, that is
Sep 26 01:37:36 `<bilboed>` they have also VdpBitmapSurface
Sep 26 01:38:01 `<bilboed>` so VdpVideoSurface => YUV stuff, VdpBitmapSurface => RGB stuff
Sep 26 01:38:11 `<bilboed>` VdpOutputSurface => output of the compositor for display
Sep 26 01:38:40 `<bilboed>` you can create VdpBitmapSurface and map(write) stuff on it
Sep 26 01:39:01 `<bilboed>` then you give it to the compositor with coordinates (and whatever else) and that's basically it
Sep 26 01:39:09 `<thaytan>` hmmm
Sep 26 01:39:26 `<bilboed>` so as a result ... I'm also gonna have to figure out how to solve hw-compositing
Sep 26 01:39:28 `<thaytan>` not sure I see how that fits into a GStreamer/rsndvdbin context
Sep 26 01:39:31 `<bilboed>` in a generic fashion that is
Sep 26 01:39:55 `<bilboed>` thaytan, was thinking you could slap GstOverlayMeta (or sth like that) with attached GstBuffers
Sep 26 01:40:06 `<bilboed>` thaytan, maybe videomixer or some generic element could do that
Sep 26 01:40:09 `<thaytan>` I guess it's a vdpspu element with video and subpicture inputs as currently with dvdspu
Sep 26 01:40:25 `<thaytan>` except the video pad accepts vdp output surface caps
Sep 26 01:40:38 `<bilboed>` I'd prefer to avoid creating yet-another-custom-element
Sep 26 01:40:45 `<thaytan>` but, I don't know what it outputs
Sep 26 01:41:12 `<thaytan>` bilboed: I don't know how to build it generically
Sep 26 01:41:48 `<thaytan>` the dvdspu element uses the video stream passing through to a) paint onto b) uses the timestamps to crank the SPU state machine
Sep 26 01:42:44 `<bilboed>` what do you need more apart from "put this RGB bitmap at these coordinates for this timestamp and this duration"
Sep 26 01:43:48 `<thaytan>` it needs the timestamps and segment info on the video stream so it knows which pixels to generate
Sep 26 01:44:04 `<thaytan>` it's more "here's a video frame, what's the overlay?"
Sep 26 01:44:13 `<thaytan>` also, dvdspu works in YUV too
Sep 26 01:44:13 `<bilboed>` sure, but it doesn't care about the *content* of that frame
Sep 26 01:44:28 `<thaytan>` bilboed: not if it's not doing the compositing, no
Sep 26 01:44:34 `<bilboed>` right
Sep 26 01:45:11 `<bilboed>` so it could see the stream go through, watch/collect segment/timestamps and decide what subpicture to attach to it (without *actually* doing any processing and letting downstream handle that)
Sep 26 01:45:13 `<thaytan>` but the model is still to pass the video buffer stream through the spu element so it can see the timing info it needs, right?
Sep 26 01:45:22 `<thaytan>` oh, of course
Sep 26 01:45:28 `<thaytan>` that's what I was suggesting, I guess I wasn't clear
Sep 26 01:45:35 `<__tim>` thaytan, dvdspu should support GstVideoOverlayComposition imho
Sep 26 01:45:38 `<bilboed>` I'm not *that* familiar with SPU
Sep 26 01:45:42 `<bilboed>` also, what __tim said :)
Sep 26 01:45:54 `<bilboed>` like that I don't have to solve it in 500 different elements
Sep 26 01:45:56 `<thaytan>` ok, I guess I'll have to look at the GstVideoOverlayComposition API
Sep 26 01:46:54 * bilboed is not looking forward "at all" to fixing this for cluster-gst
Sep 26 01:47:12 `<__tim>` it's very dumb, you can provide one or more ARGB or AYUV rectangles and either use helper API to put them onto the raw video, or attach them to the buffer; the sink or whatever can then take over the overlaying using that
Sep 26 01:47:40 `<__tim>` and it will do a bunch of conversions for you and cache them if the sink or whatever does the overlaying doesn't support what you supplied
Sep 26 01:48:54 `<thaytan>` well, that sounds feasible - although less efficient than dvdspu painting natively if the composite is in software
Sep 26 01:49:28 `<thaytan>` maybe it can be extended to add RLE AYUV rectangles as a format though?
Sep 26 01:49:44 `<__tim>` thaytan, how sure are you of that? because basically you have to parse the RLE data for every single frame, right? is that really so much faster than blitting some ready-made rectangle using orc?
Sep 26 01:50:04 `<thaytan>` dvdspu gets to skip a lot of transparent pixels
Sep 26 01:52:21 `<__tim>` yeah, but it's if else and loops etc. You might be right, I'm just curious how much difference it actually makes. Also, you don't have to use the API to blit your pixels, you can still do that as you do now and only attach the AYUV rectangle if downstream supports that
Sep 26 01:54:13 `<__tim>` it's just convenient because you only have one code path
Sep 26 01:55:01 `<thaytan>` it sounds like a structural improvement
### Blocking
* [Bug 663750](https://bugzilla.gnome.org/show_bug.cgi?id=663750)
* [Bug 725900](https://bugzilla.gnome.org/show_bug.cgi?id=725900)

---

# Issue #1608: nvav1enc: Add support for AV1 hardware encoding
Reporter: 谢美龙, updated 2023-07-24
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1608

The 8th generation NVENC has added support for AV1 encoding. [Support Matrix](https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new).
And rtpav1pay was just released in GStreamer 1.21.1. nvav1enc + rtpav1pay + webrtc would be a nice combination.
[Original issue](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1489)

---

# Issue #1259: Windows PTP support not integrated into build system
Reporter: Sebastian Dröge, updated 2023-06-29
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1259

See https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gstreamer/libs/gst/helpers/meson.build#L30-31
```meson
elif host_system == 'windows'
message('PTP not supported on Windows, not ported yet.')
```
The PTP ...See https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gstreamer/libs/gst/helpers/meson.build#L30-31
```meson
elif host_system == 'windows'
message('PTP not supported on Windows, not ported yet.')
```
The PTP code contains Windows-specific code paths so I assume this was working with the autotools build.
CC @nirbheek

Assignee: Sebastian Dröge

---

# Issue #3337: csharp: Split GStreamer libraries into separate DLLs for bindings
Reporter: Piotr Brzeziński, updated 2024-02-26
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3337

As per the discussion in https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5961#note_2258719, ideally each GStreamer library should have a separate DLL containing its own bindings, like GES does now. NuGet packaging can stay as-is.

---

# Issue #2941: gl: Add I420_10LE support
Reporter: Nicolas Dufresne, updated 2023-09-19
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2941

A drive-by small enhancement we could do: this is the 10-bit format produced by the FFmpeg HEVC decoder (at least). It would allow skipping videoconvert for:
```
... ! avdec_h265 ! glimagesink
```
with 10-bit files. Please note that unlike P010, the padding bits are placed in the most significant bits.

---

# Issue #2828: rtmp2src: no way to know if stream was properly terminated or not on EOS
Reporter: Guillaume Desmottes, updated 2023-08-04
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2828

If the RTMP server closes the connection for any reason, `rtmp2src` will [log the disconnection as INFO](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L927), stop its streaming task and [return `GST_FLOW_EOS`](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L614) the next time `gst_rtmp2_src_create()` is called.
`EOS` is also returned if the stream has been [properly terminated by the server](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L941).
I need a way to distinguish between these two scenarios in my application. In the first case I want to retry connecting to the RTMP server, while in the second I terminate the streaming pipeline. Unfortunately there is currently no way to know the actual reason why `EOS` is being returned.
Note that `EOS` is [also returned because of the idle timeout](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-bad/gst/rtmp2/gstrtmp2src.c#L617) but I don't actually use that.
The obvious solution would be to return `GST_FLOW_ERROR` instead of `EOS` on connection error. But I suppose there was a good reason to not do that in the first place? Like some server never properly terminating the stream when it's done?
So maybe this should be controlled using a new property?
@heftig @vivia: you seem to have done some work on this plugin, so maybe you know more about why we handle connection errors this way?

---

# Issue #2643: gstplayer: Add gst_player_get_state API
Reporter: Bugzilla Migration User, updated 2023-06-12
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2643

## Submitted by Lyon
**[Link to original bug (#778379)](https://bugzilla.gnome.org/show_bug.cgi?id=778379)**
## Description
For the GstPlayer state, currently we can only get the state via the state_changed callback once the mainloop starts running.
However, if we need the current state before the mainloop has started running, there is no way to get the GstPlayer state.
So consider adding a gst_player_get_state() API to get the current player state.
Version: 1.x

---

# Issue #2624: flv/rtmp: Add support for "enhanced" RTMP
Reporter: Sebastian Dröge, updated 2023-06-23
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2624

See https://github.com/veovera/enhanced-rtmp . Among other things this adds H265 support. There are patches for ffmpeg already.

---

# Issue #2479: clockoverlay: how to overlay a d3d11 surface?
Reporter: Maurizio Burato, updated 2023-06-15
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2479

I need to overlay date and time text on D3D11 surfaces.
I tried:

```
gst-launch-1.0.exe -v d3d11testsrc ! video/x-raw(memory:D3D11Memory) ! clockoverlay halignment=2 valignment=1 time-format="%e %b %Y %H:%M:%S" ! nvd3d11h264enc ! h264parse ! fakesink
```

but I get:

```
WARNING: erroneous pipeline: could not link clockoverlay0 to nvd3d11h264enc0
```

Is there another method?
Thanks

---

# Issue #2443: gst-rtsp-server: Decrease RTSP setup time
Reporter: Patricia Muscalu, updated 2023-10-02
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2443

Problem:
Setting suspend mode RESET on an RTSP media brings the live pipeline to NULL state on suspend.
When a PLAY request arrives, the pipeline is unsuspended and set to the PLAYING state. The source element has to release and reopen stream
resources, and this operation, in our case, is time-consuming due to device resource limits.
Solution:
Introduce a new suspend mode that doesn't change the state of the media on suspend, meaning the media is still PLAYING but remains blocked. When the media is unsuspended, a GstForceKeyUnit event is sent to the pipeline and the pipeline is flushed. The newly generated key frame will unblock the media.
The patch will be provided.

---

# Issue #2384: urisourcebin: src pads have no caps when added
Reporter: Guillaume Desmottes, updated 2023-03-20
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2384

I had to use `urisourcebin parse-streams=true` manually and it's been trickier than anticipated. One cannot just connect the `pad-added` signal, check the pad caps and add the elements to handle the stream, as the pad at this point has no caps yet. Because of that I had to set up a blocking probe and wait for the caps event, which can be tricky if you're not that familiar with GStreamer.
Would it make sense to wait for the caps before exposing the `src` pads and so make all this more convenient for users?

---

# Issue #1750: va: Add support for libva-win32
Reporter: Sil Vilerino, updated 2023-03-10
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1750

Since [libva 2.17](https://github.com/intel/libva/releases/tag/2.17.0) there's a new VAAPI display "libva-win32" that allows VA acceleration on Windows OS. Similarly, [Mesa 22.3](https://docs.mesa3d.org/relnotes/22.3.0.html) released `vaon12_drv_video.dll`, a VA driver for Windows, which is based on D3D12 Video APIs and implements the following entrypoints (where hardware/driver support is available) on Windows OS:
```
VAProfileH264ConstrainedBaseline: VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointEncSlice
VAProfileH264High : VAEntrypointVLD
VAProfileH264High : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointVLD
VAProfileHEVCMain : VAEntrypointEncSlice
VAProfileHEVCMain10 : VAEntrypointVLD
VAProfileHEVCMain10 : VAEntrypointEncSlice
VAProfileVP9Profile0 : VAEntrypointVLD
VAProfileVP9Profile2 : VAEntrypointVLD
VAProfileAV1Profile0 : VAEntrypointVLD
VAProfileNone : VAEntrypointVideoProc
```
Enhancing/extending GStreamer with VAAPI `VADisplay` initialization code using the new `vaGetDisplayWin32` function would allow the vaapi* related plugins to work on Windows. Using it together with the `vaon12` Mesa driver would allow access to (layered via libva) D3D12 video support in GStreamer for decode, encode and some video processing as well, on any device supporting D3D12 Video APIs. Please note the vaapi* plugins are already working via `libd3d12.so` and the `d3d12_drv_video.so` Mesa driver in `Windows Subsystem for Linux` via the existing VAAPI DRM/X11 initialization paths.

---

# Issue #1691: configuration of sink proxy with GStreamer
Reporter: Stefan Lorenz, updated 2023-01-10
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1691

1. Is it possible to run the rtsps URL (rtspclientsink location=rtsps://...) through a proxy with GStreamer?
For example, I would like to use the following GStreamer pipeline:
`gst-launch-1.0 -v rtspsrc location=rtsp://127.0.0.1:8554/live protocols=tcp ! queue ! rtph264depay ! rtspclientsink location=rtsps://client78558:6a8d99a2@27c995.entrypoint.cloud.wowza.com:443/app-866S2P81/37a17990 protocols=tcp`
2. If yes, with which proxy can it be implemented?
For example, a proxy with an rtspsrc URL works with GStreamer. Here, I have used [live555proxyserver](https://hub.docker.com/r/migoller/live555proxyserverdocker) in a Docker environment.

---

# Issue #1495: mxfdemux: can't handle Canon XF705 H265 10bit MXF video
Reporter: David Manpearl, updated 2022-10-12
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1495

I have been unable to decode H265 10-bit formats from Canon XF705.
My goal is to transcode into H264 with a GStreamer pipeline something like this:
```
gst-launch-1.0 filesrc location=A003C002H1901045W_CANON.MXF ! qtdemux ! h265parse ! avdec_h265 ! videoconvert ! videoscale ! video/x-raw,width=1280,height=720 ! x264enc ! h264parse ! queue ! mp4mux ! filesink location=video.mp4
Setting pipeline to PAUSED, PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: This file is invalid and cannot be played.
Additional debug info:
qtdemux.c(747): gst_qtdemux_pull_atom (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:
atom has bogus size 101591860
ERROR: pipeline doesn't want to preroll.
```
I have had similar errors with decodebin, qtdemux, mxfdemux, and avdemux_mxf.
Example file download: https://www.dropbox.com/sh/q5m7cxgneq5z5h3/AAAh-d3FdhouZ2bFv2FajR18a?dl=0
*(note: Adobe Premiere can decode this file. Quicktime and VLC cannot)*
Here are some stats on the file:
```
gst-launch-1.0 --version
gst-launch-1.0 version 1.14.4
GStreamer 1.14.4
```
**ffprobe**:
```
ffprobe -hide_banner -show_format -show_streams -print_format json A003C002H1901045W_CANON.MXF
Metadata:
uid : 3b6f6487-8405-4901-802e-242719000075
generation_uid : 3b6f6487-8405-4903-802e-242719000075
company_name : CANON
product_name : XF705
product_version : 1.00
product_uid : 060e2b34-0401-010d-0e15-005658460400
Duration: 00:00:16.02, start: 0.000000, bitrate: 157466 kb/s
Stream #0:0: Video: none, none(progressive), 3840x2160, SAR 1:1 DAR 16:9,
9.94 fps, 59.94 tbr, 59.94 tbn, 59.94 tbc
"codec_type": "video",
"codec_tag": "0x0000",
"width": 3840,
"height": 2160,
"has_b_frames": 0,
"r_frame_rate": "60000/1001",
```
**mediainfo**:
```
mediainfo A003C002H1901045W_CANON.MXF
Format : MXF
Format version : 1.3
Format profile : OP-1a
Writing application : CANON XF705 1.00
Codec ID : 0E15000402100001-0E15000500013000
Color space : YUV
Chroma subsampling : 4:2:2
Bit depth : 10 bits
Scan type : Progressive
```
**Tim Müller suggested**:
This is what I get:
```
$ gst-play-1.0 A003C002H1901045W_CANON.MXF
WARNING No decoder available for type 'video/x-mxf-
06.0e.2b.34.04.01.01.0c.0e.15.00.04.02.10.00.01-
06.0e.2b.34.04.01.01.0c.0e.15.00.05.00.01.30.00'.
WARNING debug information: gsturidecodebin.c(921): unknown_type_cb ():
/GstPlayBin:playbin/GstURIDecodeBin:uridecodebin0
```
[TM] "I think we're just missing mappings for H265 in mxfdemux from the looks of it."

---

# Issue #1278: Add support for windows application loopback audio capture to specify/exclude process id
Reporter: Dániel Horváth, updated 2022-10-18
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1278

This is a feature request.
Currently there is no way to capture/exclude application-specific audio on Windows.
Windows supports this, but wasapisrc and wasapi2src don't take the TargetProcessId and ProcessLoopbackMode parameters.
Related: [https://docs.microsoft.com/en-us/samples/microsoft/windows-classic-samples/applicationloopbackaudio-sample/](https://docs.microsoft.com/en-us/samples/microsoft/windows-classic-samples/applicationloopbackaudio-sample/)
[https://github.com/microsoft/windows-classic-samples/tree/main/Samples/ApplicationLoopback](https://github.com/microsoft/windows-classic-samples/tree/main/Samples/ApplicationLoopback)
Please add support for this.

---

# Issue #1250: Follow-up from "va: vpp: add compositor"
Reporter: Víctor Manuel Jáquez Leal, updated 2022-10-10
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1250

The following discussion from !2332 should be addressed:
- [ ] @He_Junyan started a [discussion](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2332#note_1403622):
> @vjaquez, I think we need to enable this property in the vapostproc element. It does not make sense that vacompositor can do this while vapostproc can not.

Assignee: U. Artie Eoff

---

# Issue #1167: va: Multi-GPU support in a process
Reporter: Seungha Yang (seungha@centricular.com), updated 2022-11-05
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1167

cc @vjaquez @He_Junyan
I've tested a multi-Intel-GPU environment on Windows with the d3d11 and/or qsv plugins (I am not talking about inter-GPU processing, just multiple GPU objects flowing in dedicated branches of a pipeline), and it works pretty well and as expected.
But I am not able to find any multi-GPU consideration in gstva. Clearly MSDK has no consideration for multi-GPU.
I'm wondering whether such a scenario is possible on Linux, or whether there is a plan to support it.