GStreamer issues — https://gitlab.freedesktop.org/groups/gstreamer/-/issues

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1295
qtdemux: Unexpected segment end detected (EOS) in push mode (live stream, fmp4)
2022-06-21 · Andrzej Surdej

### Describe your issue
The original issue comes from Vevo application live video playback that is interrupted by unexpected EOS (end of segment) detected from qtdemux. The issue exists with push mode only (pull mode doesn't detect EOS in my case).
The init segment contains a duration of 1 min (60 s), and with each moof it is extended by "timestamp" (from the qtdemux_parse_trun() function), which is the fragment decode time (tfdt) or sample DTS plus the sum of all sample durations. With such calculations, the segment stop time is expressed in the DTS timeline.
Then, while processing the data (gst_qtdemux_process_adapter() -> QTDEMUX_STATE_MOVIE), qtdemux tries to detect the end of segment based on the PTS value of each frame, which is a higher value than the DTS and passes the segment stop time:
```c
  /* check for segment end */
  if (G_UNLIKELY (demux->segment.stop != -1
          && demux->segment.stop <= pts && stream->on_keyframe)
      && !(demux->upstream_format_is_time && demux->segment.rate < 0)) {
    GST_DEBUG_OBJECT (demux, "we reached the end of our segment.");
    stream->time_position = GST_CLOCK_TIME_NONE;        /* this means EOS */
```
The issue is barely visible because of the additional "on_keyframe" check, which is supposed to cut the segment on keyframes only (so the decoder has all the data it needs).
The reason I hit this issue is that my stream has more keyframes (a couple in a single fragment/moof/mdat), so the EOS is thrown before actually reaching the end (about 3 samples earlier, depending on the dts<->pts shift).
#### Expected Behavior
Video playback should continue without EOS detected
#### Observed Behavior
EOS is thrown from qtdemux and the playback is stopped.
#### Setup
- **Operating System:** Ubuntu 20.04
- **Device:** Computer
- **GStreamer Version:** GStreamer 1.16.2 and GStreamer 1.21.0 (GIT)
- **Command line:** GST_DEBUG="qtdemux:5,2" gst-launch-1.0 pushfilesrc location=<path>/video_8 ! qtdemux ! appsink
### Steps to reproduce the bug
1. Fetch attached video sample (dump from VEVO live channel): [video_8](/uploads/7a80e18f1686084f21cafb6578ef3bf8/video_8)
2. Open a terminal
3. Type `GST_DEBUG="qtdemux:5,2" gst-launch-1.0 pushfilesrc location=<path>/video_8 ! qtdemux ! appsink`
4. Parsing ends with EOS thrown from qtdemux -> segment end detected
### How reproducible is the bug?
Always
### Screenshots if relevant
### Solutions you have tried
The solution for this case would be to replace PTS with DTS when checking for the segment stop. That would make EOS detection more reliable and based on consistent timestamp values. This, however, may break seeking with a stop time set (as that is PTS-based, I believe).
### Related non-duplicate issues
### Additional Information
I have more dumps from the same content if needed.

https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/102
iOS tutorial 4 & 5: Undefined symbol: _gst_plugin_mms_register
2022-06-20 · fjmax

Hello,
I'm trying to build the iOS example project on the master branch of this repo. Tutorials 1, 2 and 3 build fine, but 4 and 5 both fail with the same error: `Undefined symbol: _gst_plugin_mms_register`.
My environment:
* Xcode: 13.3.1
* GStreamer binaries for iOS: 1.20.2
* OS: macOS 12.3.1 (Monterey)
--
I noticed the latest commit on the master branch is ce5b31a1b0b953dbc147a7ab564d656739cdc792, which is for release 1.19.2. Could this be the reason the symbol couldn't be found?
Or is there anything I can do to provide more information?

https://gitlab.freedesktop.org/gstreamer/gstreamer-sharp/-/issues/63
gstreamer-sharp: An unhandled exception of type 'System.Exception' occurred in GLibSharp.dll: Unknown type GstRTSPMessage
2022-06-20 · Connor Hawes

Currently trying to handle the before-send rtspsrc signal in order to add headers to RTSP messages.
Upon the handler being triggered, the titular exception is thrown.
Here is my current handler:
```csharp
private void BeforeSendCb(object sender, SignalArgs args)
{
    Gst.Rtsp.RTSPMessage message = (Gst.Rtsp.RTSPMessage)args.Args[0];
    message.AddHeader(Gst.Rtsp.RTSPHeaderField.CompanyId, "CONNOR HAWES");
    Trace.WriteLine("BREAK HERE");
}
```
And the subscription:
```csharp
_rtspsrc.Connect("before-send", BeforeSendCb);
```
I was wondering whether I'm doing something wrong, or if there is a workaround?
Thanks in advance.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1292
HDR10 CLL metadata is allowed to change
2022-06-20 · ValZapod

Contrary to popular belief, HDR10 static metadata can change per segment on seamless-branching Blu-rays (ripping of which was also very broken due to wrong deletion of duplicated TrueHD frames at segment borders), and it is also allowed to change in mp4. As the ffmpeg mailing list says (right now the [last HDR metadata patches](https://patchwork.ffmpeg.org/project/ffmpeg/list/?series=3373) are being accepted):
https://www.mail-archive.com/ffmpeg-devel@ffmpeg.org/msg108319.html
> in that some formats allow use to set it on frame,
> some on the stream, some on the container
> I notice that color data can be set per frame and per stream already
> and I don’t fully understand how these interact if converting between data in
> frame (e.g HEVC SEI in stream in hev1) or data in header (e.g. MOV mdcv tag or
> HEVC SEI in hvc1 format).
And this is correct. The free CTA-861-G (and now -H) standard mandates this, and I quote:
> If the Source supports the transmission of the Dynamic Range and Mastering InfoFrame and if it
> determines that the Sink is capable of receiving that information, the Source shall send the Dynamic
> Range and Mastering InfoFrame **once per Video Field**
What that means is that the metadata is allowed to change, and the display should be fast enough to change it on the fly.
In fact there is a sample from "In the Heart of the Sea" movie https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/uploads/16c628c535865d7282a48317064345a2/out.mp4 that utilizes that.
It can cause all kinds of issues for you. mpv does support it, and so does MadVR.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/727
HDR: MDCV and CLL should not be exposed as caps
2022-06-20 · Stéphane Cerveau <scerveau@igalia.com>

In the case of HDR10+ dynamic metadata (see #700), GOPs or scenes can change the MDCV and CLL values during stream playback.
A GstVideoMeta approach should be considered instead of caps.
Only the type of HDR (HDR10, HDR10+, DolbyVision) should be exposed by caps, allowing downstream elements to adapt and optimize data retrieval.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/875
HDR10 CLL metadata is allowed to change
2022-06-20 · ValZapod

Contrary to popular belief, HDR10 static metadata can change per segment on seamless-branching Blu-rays (ripping of which was also very broken due to wrong deletion of duplicated TrueHD frames at segment borders), and it is also allowed to change in mp4. As the ffmpeg mailing list says (right now the [last HDR metadata patches](https://patchwork.ffmpeg.org/project/ffmpeg/list/?series=3373) are being accepted):
https://www.mail-archive.com/ffmpeg-devel@ffmpeg.org/msg108319.html
> in that some formats allow use to set it on frame,
> some on the stream, some on the container
> I notice that color data can be set per frame and per stream already
> and I don’t fully understand how these interact if converting between data in
> frame (e.g HEVC SEI in stream in hev1) or data in header (e.g. MOV mdcv tag or
> HEVC SEI in hvc1 format).
And this is correct. The free CTA-861-G (and now -H) standard mandates this, and I quote:
> If the Source supports the transmission of the Dynamic Range and Mastering InfoFrame and if it
> determines that the Sink is capable of receiving that information, the Source shall send the Dynamic
> Range and Mastering InfoFrame **once per Video Field**
What that means is that the metadata is allowed to change, and the display should be fast enough to change it on the fly.
In fact there is a sample from "In the Heart of the Sea" movie https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/uploads/16c628c535865d7282a48317064345a2/out.mp4 that utilizes that.
It can cause all kinds of issues for you. mpv does support it, and so does MadVR.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1264
AES-GCM to support RFC 3711 in GStreamer
2022-06-20 · Chandrasekhar Jharapla

I am unable to encode and decode the stream with libsrtp:
```
gst-launch-1.0 udpsrc port=5200 caps='application/x-srtp, encoding-name=JPEG, \
    ssrc=(uint)1356955624, \
    srtp-key=(buffer)012345678901234567890123456789012345678901234567890123456789, \
    srtp-cipher=(string)aes-128-gcm, \
    srtcp-cipher=(string)aes-128-gcm, roc=(uint)0' \
    ! srtpdec ! rtpjpegdepay ! jpegdec ! autovideosink
```
With the above pipeline I am not able to encode the data, as it is getting an encoder error.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1141
GstVideoInfo chroma_site is wrong for YUV444 formats (e.g. Y444_LE)
2022-06-18 · Adrien De Coninck

GstVideoInfo chroma_site information should be GST_VIDEO_CHROMA_SITE_JPEG for YUV444 formats (e.g. Y444_LE).
Due to [this condition](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-base/gst-libs/gst/video/video-info.c#L169), it is set to GST_VIDEO_CHROMA_SITE_MPEG2 regardless of the YUV format:
```c
static void
set_default_colorimetry (GstVideoInfo * info)
{
const GstVideoFormatInfo *finfo = info->finfo;
if (GST_VIDEO_FORMAT_INFO_IS_YUV (finfo)) {
if (info->height > 576) {
info->chroma_site = GST_VIDEO_CHROMA_SITE_H_COSITED;
info->colorimetry = default_color[DEFAULT_YUV_HD];
} else {
info->chroma_site = GST_VIDEO_CHROMA_SITE_NONE;
info->colorimetry = default_color[DEFAULT_YUV_SD];
}
} else if (GST_VIDEO_FORMAT_INFO_IS_GRAY (finfo)) {
info->colorimetry = default_color[DEFAULT_GRAY];
} else if (GST_VIDEO_FORMAT_INFO_IS_RGB (finfo)) {
info->colorimetry = default_color[DEFAULT_RGB];
} else {
info->colorimetry = default_color[DEFAULT_UNKNOWN];
}
}
```
Here is a gst-launch-1.0 command line that reproduces the "error":
```
gst-launch-1.0.exe videotestsrc ! "video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)60/1, format=Y444_16LE" ! autovideosink
```

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1290
gstreamer-sharp: Gst.Application requires WebRTC (gst-plugins-bad) by default
2022-06-17 · Pedro Castro

The static constructor in Gst.Application registers WebRTC by default:
```csharp
GLib.GType.Register(Gst.WebRTC.WebRTCSessionDescription.GType, typeof(Gst.WebRTC.WebRTCSessionDescription));
```
This implies that `gst-plugins-bad`, which contains WebRTC, will be required to run gstreamer-sharp.
Wondering if this was a design decision. I've switched to gstreamer-sharp in Gnome Subtitles, where WebRTC isn't used, so I'd rather not force users to install plugins-bad unnecessarily.

https://gitlab.freedesktop.org/gstreamer/gst-libav/-/issues/98
Playing ALAC files and generating MP4 thumbnails no longer working in versions newer than 1.18.3
2022-06-17 · Marc Ranolfi

I have Arch Linux and Cinnamon with its default Nemo file manager.
Since upgrading from version 1.18.3 to the 1.18.4 series, I noticed that Gnome-videos (totem) and Rhythmbox would no longer play my ALAC files. In addition, Nemo is unable to generate thumbnails for my MP4 video files.
I have isolated this issue by downgrading all gstreamer packages until it was finally resolved by downgrading gst-libav to version [1.18.3-1 from the Arch Linux archive](https://archive.archlinux.org/packages/g/gst-libav/gst-libav-1.18.3-1-x86_64.pkg.tar.zst).
What can possibly be going on and how can I help in debugging this?
Thanks.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1289
iOS GstPlay - Can't receive any signal
2022-06-17 · budain

### Describe your issue
I would like to receive some signal from `GstPlay`: [see documentation](https://gstreamer.freedesktop.org/documentation/play/gst-libs/gst/play/gstplay-types.html?gi-language=c#GstPlaySignalAdapter::state-changed)
#### Expected Behavior
My function should be called and print my logs
#### Setup
- **Operating System:** iOS
- **Device:** Mobile
- **GStreamer Version:** 1.20.2
- **Command line:** `g_signal_connect (adapter, "state-changed", G_CALLBACK(stateChangedCb), NULL);`
### Steps to reproduce the bug
```objc
static GstPlay *player;
static GstPlaySignalAdapter *adapter;

-(instancetype)init {
    self = [super init];
    if (!player) {
        [self configurePlayer];
    }
    return self;
}

-(void)configurePlayer {
    GstreamerConfiguration(); // Like gst_ios_init
    player = gst_play_new(NULL);
    adapter = gst_play_signal_adapter_new(player);
    gst_play_config_set_seek_accurate(gst_play_get_config(player), true);
    [self configureCallBacks];
}

-(void)configureCallBacks {
    NSLog(@"---- callllled");
    g_signal_connect (adapter, "state-changed", G_CALLBACK(stateChangedCb), NULL);
}

void stateChangedCb (GstPlaySignalAdapter *adapter, GstPlayState *state, void *data) {
    NSLog(@"---- NEW STATE");
}
```
The log `---- NEW STATE` should be printed.
Do you have any idea?

https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/104
iOS GstPlay - Can't receive any signal
2022-06-17 · budain

I would like to use `GstPlay` to play some audio file from a remote URL. I can `play`, `pause`, `seek`...
I want to listen for an event: `state_changed` ([see doc](https://gstreamer.freedesktop.org/documentation/play/gst-libs/gst/play/gstplay-types.html?gi-language=c#GstPlaySignalAdapter::state-changed))
I don't understand why I don't receive any signal. My code:
```objc
static GstPlaySignalAdapter *adapter;

-(instancetype)init {
    self = [super init];
    if (!monoPlayer) {
        [self configurePlayer];
    }
    return self;
}

-(void)configurePlayer {
    GstreamerConfiguration(); // Like gst_ios_init
    player = gst_play_new(NULL);
    adapter = gst_play_signal_adapter_new(player);
    gst_play_config_set_seek_accurate(gst_play_get_config(player), true);
    [self configureCallBacks];
}

-(void)configureCallBacks {
    NSLog(@"---- callllled");
    g_signal_connect (adapter, "state-changed", G_CALLBACK(stateChangedCb), NULL);
}

void stateChangedCb (GstPlaySignalAdapter *adapter, GstPlayState *state, void *data) {
    NSLog(@"---- NEW STATE");
    // gst_print ("State changed: %s\n", gst_play_state_get_name (state));
}
```
I use GStreamer `1.20.2`.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1288
Unsafe eglimage cache usage in gstglupload.c in combination with tee
2022-06-17 · Erik De Rijcke

`gstglupload.c` with tee calls `_set_cached_eglimage` and potentially frees the previously stored eglimage pointer, later triggering critical warnings when the eglimage is unreffed again, and potentially causing segfaults.
From gstreamer irc:
```
00:55 < ndufresne> zubzub: ah, the cache is still not tee safe?
00:56 < ndufresne> We've fixed many cache over the time, so I can't remember if that one is still broken... But you can't cache on the buffer itself, that's not safe
```
Tested on gstreamer 1.20.1.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/971
SIGSEGV in subprojects/gst-plugins-base/gst/playback/gstparsebin.c during HLS stream using souphttpsrc
2022-06-16 · Bill Hofmann

Uprev'd from 1.20.1 (with two local changes in kms) to 1.20.3; now a 100% reproducible SIGSEGV while playing a graph that works extremely reliably in 1.20.1. The first sign was my Python impl getting buffering messages (which it never had before), followed by the crash. Repro'd outside my program, with a reduced M3U8 file.
Build/Env:
- Ubuntu 21.10
- `meson -Dvaapi=enabled -Dbad=enabled -Dugly=enabled builddir`
Command (after ninja -C builddir devenv)
```
GST_DEBUG_FILE=./crash.txt GST_DEBUG=6 gdb --args gst-launch-1.0 \
souphttpsrc location="http://127.0.0.1/1D2E355A-BD2C-4807-91EA-4C2236D2DBED/0/short.m3u8" ! parsebin name=pb ! queue ! vaapih265dec ! \
video/x-raw,format=P010_10LE ! queue max-size-bytes=100663300 ! kmssink connector-id=78 plane-id=55
```
I set a `catch signal SIGSEGV` and here is the tail of the GDB output:
```
[New Thread 0x7fffbd7fa640 (LWP 1624)]
Pipeline is PREROLLING ...
[New Thread 0x7fffbcff9640 (LWP 1625)]
Got context from element 'vaapidecode_h265-0': gst.vaapi.Display=context, gst.vaapi.Display=(GstVaapiDisplay)"\(GstVaapiDisplayDRM\)\ vaapidisplaydrm1", gst.vaapi.Display.GObject=(GstObject)"\(GstVaapiDisplayDRM\)\ vaapidisplaydrm1";
Got context from element 'souphttpsrc0': gst.soup.session=context, session=(GstSoupSession)NULL;
[New Thread 0x7fff9ffff640 (LWP 1626)]
[New Thread 0x7fff9f7fe640 (LWP 1627)]
[New Thread 0x7fff9effd640 (LWP 1628)]
[New Thread 0x7fff9e7fc640 (LWP 1629)]
[New Thread 0x7fff9dffb640 (LWP 1630)]
[New Thread 0x7fff9d7fa640 (LWP 1631)]
[Switching to Thread 0x7fff9d7fa640 (LWP 1631)]
Thread 60 "task3" hit Catchpoint 1 (signal SIGSEGV), gst_parse_chain_expose (chain=chain@entry=0x7fff88047ae0, endpads=endpads@entry=0x7fff9d7f9340, missing_plugin=missing_plugin@entry=0x7fff9d7f932c, missing_plugin_details=missing_plugin_details@entry=0x7fff80002c00, last_group=last_group@entry=0x7fff9d7f9330, uncollected_streams=uncollected_streams@entry=0x7fff9d7f9334) at ../subprojects/gst-plugins-base/gst/playback/gstparsebin.c:3777
3777 if (p->active_stream && p->active_collection == NULL
(gdb) frame
#0 gst_parse_chain_expose (chain=chain@entry=0x7fff88047ae0, endpads=endpads@entry=0x7fff9d7f9340, missing_plugin=missing_plugin@entry=0x7fff9d7f932c,
missing_plugin_details=missing_plugin_details@entry=0x7fff80002c00, last_group=last_group@entry=0x7fff9d7f9330, uncollected_streams=uncollected_streams@entry=0x7fff9d7f9334)
at ../subprojects/gst-plugins-base/gst/playback/gstparsebin.c:3777
3777 if (p->active_stream && p->active_collection == NULL
(gdb) up
#1 0x00007ffff4c3a8e6 in gst_parse_chain_expose (chain=<optimized out>, endpads=endpads@entry=0x7fff9d7f9340, missing_plugin=missing_plugin@entry=0x7fff9d7f932c,
missing_plugin_details=missing_plugin_details@entry=0x7fff80002c00, last_group=last_group@entry=0x7fff9d7f9330, uncollected_streams=uncollected_streams@entry=0x7fff9d7f9334)
at ../subprojects/gst-plugins-base/gst/playback/gstparsebin.c:3788
3788 ret |= gst_parse_chain_expose (childchain, endpads, missing_plugin,
(gdb)
```
Output of GST is attached: [ourcrash.txt](/uploads/b880eb7237fd738f6b0ebea8155d6749/ourcrash.txt).
Note this version runs just fine using filesrc of a different MP4 instead of the souphttpsrc.
Just for jollies, here's the M3U8: [short.m3u8](/uploads/12aaaa568f4294cd313cea4ee0857f10/short.m3u8)
I can provide the TS files, but they're kinda big. Happy to attach if desired.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1272
wasapi2: cannot get device provider factory "wasapi2deviceprovider"
2022-06-16 · Cedric GNIEWEK

### Describe your issue
Getting the wasapi2 device provider doesn't seem to work properly.
#### Expected Behavior
Calling `auto provider = gst_device_provider_factory_get_by_name("wasapi2deviceprovider");` returns the wasapi2 device provider.
#### Observed Behavior
Calling `auto provider = gst_device_provider_factory_get_by_name("wasapi2deviceprovider");` returns null and I get this message in the log :
> GST_DEVICE_PROVIDER_FACTORY gstdeviceproviderfactory.c:377:gst_device_provider_factory_get_by_name: no such device provider factory "wasapi2deviceprovider"!
The plugin seems to have loaded properly because, before that, I got this message when a wasapi2sink was created :
> GST_PLUGIN_LOADING gstplugin.c:984:_priv_gst_plugin_load_file_for_registry: plugin "I:\gstreamer\1.0\msvc_x86_64\lib\gstreamer-1.0\gstwasapi2.dll" loaded
>
> GST_ELEMENT_FACTORY gstelementfactory.c:489:gst_element_factory_create_with_properties: creating element "wasapi2sink"
#### Setup
- **Operating System:** Windows 11
- **Device:** Computer
- **GStreamer Version:** 1.20.2
- **Command line:** None, C++ application
### Steps to reproduce the bug
### How reproducible is the bug?
Always
### Screenshots if relevant
### Solutions you have tried
### Related non-duplicate issues
### Additional Information
Using the same function with the "wasapideviceprovider" parameter does work.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1276
Capture specific window using d3d11screencapturesrc
2022-06-16 · Alireza Miryazdi

Is it possible to capture only a specific window when using d3d11screencapturesrc? As far as I know it can only capture a full monitor display for now.
If not, could you hint me at where I can start with adding this functionality? I'm still quite new to this and there's a lot to go through, but I'd like to try at least.
Thank you in advance.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1188
A question about changing elements in a pipeline
2022-06-16 · Yuhao Sun

Hi, I am new to GStreamer and I have a question about changing some elements in a running pipeline:
https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html?gi-language=python
The above tutorial shows that to change an element in the pipeline, I should send EOS to its sink pad and wait for the EOS event to appear on its source pad. But how can I change the final element in the pipeline, which has no source pad? For example, replacing the filesink with a fakesink to stop recording and save all data into a file in my program. Is there a recommended way?
By the way, after changing the element successfully, is it necessary to remove all idle elements from the bin using gst_bin_remove, just like in the tutorial? I tried not doing it and usually ran into some strange problems.
I am looking forward to your reply.
Kind Regards,
Yuhao

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1285
iOS: Cannot retrieve width & height via "check_media_size" method
2022-06-15 · Siegbaert

Hello GStreamer community,
I am developing a native iOS application, which should be capable of displaying a live videostream.
My code is based on the official [GStreamer iOS tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/ios/index.html?gi-language=c).
The stream itself works. It is displayed and I am quite satisfied with resolution, FPS and latency of the stream.
But I am not able to get the following method - which was taken from the [tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/ios/a-basic-media-player.html?gi-language=c) - to return the expected result:
```objc
static void check_media_size (GStreamerBackend *self) {
    GstElement *video_sink;
    GstPad *video_sink_pad;
    GstCaps *caps;
    GstVideoInfo info;

    /* Retrieve the Caps at the entrance of the video sink */
    g_object_get (self->pipeline, "video-sink", &video_sink, NULL);
    video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
    caps = gst_pad_get_current_caps (video_sink_pad);

    if (gst_video_info_from_caps (&info, caps)) {
        NSLog(@"check_media_size - Media size is %dx%d, notifying application", info.width, info.height);
        info.width = info.width * info.par_n / info.par_d;
        GST_DEBUG ("Media size is %dx%d, notifying application", info.width, info.height);
        if (self->ui_delegate && [self->ui_delegate respondsToSelector:@selector(mediaSizeChanged:height:)])
        {
            [self->ui_delegate mediaSizeChanged:info.width height:info.height];
        }
    }

    gst_caps_unref(caps);
    gst_object_unref (video_sink_pad);
    gst_object_unref(video_sink);
}
```
The function call:
`g_object_get (self->pipeline, "video-sink", &video_sink, NULL);`
triggers the following warning:
> (:22803): GLib-GObject-WARNING **: 08:43:06.226: g_object_get_is_valid_property: object class 'GstPipeline' has no property named 'video-sink'
As a result, the next 2 lines of code also fail:
```
video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
caps = gst_pad_get_current_caps (video_sink_pad);
```
> (:22803): GStreamer-CRITICAL **: 08:43:06.226: gst_element_get_static_pad: assertion 'GST_IS_ELEMENT (element)' failed.
> (:22803): GStreamer-CRITICAL **: 08:43:06.226: gst_pad_get_current_caps: assertion 'GST_IS_PAD (pad)' failed
And finally, the condition in the if (...) block always evaluates to false:
`if (gst_video_info_from_caps (&info, caps))`
Therefore, the code inside the if-block is never executed, and there's no way to receive the media info (e.g. width & height) of the video being played.
I do not understand why the pipeline does not seem to have a "video-sink" property, but the stream itself works and is displayed.
Any help/hint would be appreciated.
----
### Additional information
**GStreamer version:** I tested the behaviour with GStreamer 1.19.3 as well as 1.20.2.
On the server side (MacOs), I am running the following pipeline:
`gst-launch-1.0 -v autovideosrc ! "video/x-raw,framerate=30/1" ! queue ! jpegenc ! queue ! rtpjpegpay ! udpsink host=<IP_ADDRESS> port=8003`
On client side (iOS), this is the pipeline used to display the stream:
`udpsrc port=8003 ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2661516146, timestamp-offset=(uint)3924289949, seqnum-offset=(uint)1975 ! rtpbin ! rtpjpegdepay ! queue ! jpegdec ! queue ! videoconvert ! autovideosink sync=false`

https://gitlab.freedesktop.org/gstreamer/gstreamer-project/-/issues/97
iOS: Cannot retrieve width & height via "check_media_size" method
2022-06-15 · Siegbaert

Hello GStreamer community,
I am developing a native iOS application, which should be capable of displaying a live videostream.
My code is based on the official [GStreamer iOS tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/ios/index.html?gi-language=c).
The stream itself works. It is displayed and I am quite satisfied with resolution, FPS and latency of the stream.
But I am not able to get the following method - which was taken from the [tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/ios/a-basic-media-player.html?gi-language=c) - to return the expected result:
```objc
static void check_media_size (GStreamerBackend *self) {
    GstElement *video_sink;
    GstPad *video_sink_pad;
    GstCaps *caps;
    GstVideoInfo info;

    /* Retrieve the Caps at the entrance of the video sink */
    g_object_get (self->pipeline, "video-sink", &video_sink, NULL);
    video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
    caps = gst_pad_get_current_caps (video_sink_pad);

    if (gst_video_info_from_caps (&info, caps)) {
        NSLog(@"check_media_size - Media size is %dx%d, notifying application", info.width, info.height);
        info.width = info.width * info.par_n / info.par_d;
        GST_DEBUG ("Media size is %dx%d, notifying application", info.width, info.height);
        if (self->ui_delegate && [self->ui_delegate respondsToSelector:@selector(mediaSizeChanged:height:)])
        {
            [self->ui_delegate mediaSizeChanged:info.width height:info.height];
        }
    }

    gst_caps_unref(caps);
    gst_object_unref (video_sink_pad);
    gst_object_unref(video_sink);
}
```
The function call:
`g_object_get (self->pipeline, "video-sink", &video_sink, NULL);`
triggers the following warning:
> (:22803): GLib-GObject-WARNING **: 08:43:06.226: g_object_get_is_valid_property: object class 'GstPipeline' has no property named 'video-sink'
As a result, the next 2 lines of code also fail:
```
video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
caps = gst_pad_get_current_caps (video_sink_pad);
```
> (:22803): GStreamer-CRITICAL **: 08:43:06.226: gst_element_get_static_pad: assertion 'GST_IS_ELEMENT (element)' failed.
> (:22803): GStreamer-CRITICAL **: 08:43:06.226: gst_pad_get_current_caps: assertion 'GST_IS_PAD (pad)' failed
And finally, the condition in the if (...) block always evaluates to false:
`if (gst_video_info_from_caps (&info, caps))`
Therefore, the code inside the if-block is never executed, and there's no way to receive the media info (e.g. width & height) of the video being played.
I do not understand why the pipeline does not seem to have a "video-sink" property, but the stream itself works and is displayed.
Any help/hint would be appreciated.
----
### Additional information
**GStreamer version:** I tested the behaviour with GStreamer 1.19.3 as well as 1.20.2.
On the server side (MacOs), I am running the following pipeline:
`gst-launch-1.0 -v autovideosrc ! "video/x-raw,framerate=30/1" ! queue ! jpegenc ! queue ! rtpjpegpay ! udpsink host=<IP_ADDRESS> port=8003`
On client side (iOS), this is the pipeline used to display the stream:
`udpsrc port=8003 ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, payload=(int)26, ssrc=(uint)2661516146, timestamp-offset=(uint)3924289949, seqnum-offset=(uint)1975 ! rtpbin ! rtpjpegdepay ! queue ! jpegdec ! queue ! videoconvert ! autovideosink sync=false`

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1280
video-format: Rename P010_10{LE,BE} to P010{LE,BE}
2022-06-15 · Seungha Yang <seungha@centricular.com>

`P010` is an industry-standard format and the format name is pretty much self-explanatory.
I believe `P010_10LE` naming makes it a bit exotic and also it is unnecessarily duplicating the same meaning into its format name. (I haven't seen such naming outside of gstreamer)