# GStreamer issues
https://gitlab.freedesktop.org/groups/gstreamer/-/issues (exported 2022-11-29T13:09:52Z)

## When streaming to iPhone using Tutorial 3, shows "no element udpsrc"
https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/109 · 2022-11-29 · Sreeja G

Created an application in Xcode to run Tutorial 3 with GStreamer 1.18.6. After running the app, it shows the playback video. I need to stream video from the MacBook's internal camera to the iPhone. I installed the GStreamer development and runtime packages (version 1.20.4) on macOS using brew.
I ran this command in the macOS terminal:

```
gst-launch-1.0 -v avfvideosrc device-index=0 ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=200 speed-preset=superfast ! queue ! udpsink host=192.168.0.108 port=5005
```
The MacBook camera then turned on and started streaming.
Then I added this pipeline in the Xcode application, which includes the universal GStreamer framework (1.18.6) from Library/Developer/Frameworks/Gstreamer/iPhone.sdk/:

```
gst_parse_launch("udpsrc port=5005 ! application/x-rtp,encoding-name=H265,payload=96 ! rtph265depay ! h265parse ! queue ! avdec_h265 ! autovideosink sync=false", &error);
```
But I got the error "Unable to build pipeline: no element 'udpsrc'".
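On iOS, GStreamer is linked statically and only the plugins registered at init time are available; `udpsrc`/`udpsink` live in the `udp` plugin. A sketch of what the registration needs to contain (the `gst_ios_init.m` file name and init function follow the standard GStreamer iOS templates; treat the exact names as an assumption to verify against your project):

```c
/* Sketch of gst_ios_init.m (file and function names assumed from the standard
 * GStreamer iOS templates): the "udp" plugin, which provides udpsrc and
 * udpsink, must be declared and registered before gst_parse_launch() runs,
 * otherwise the parser reports: no element "udpsrc". */
#include <gst/gst.h>

GST_PLUGIN_STATIC_DECLARE(udp);

void
gst_ios_init (void)
{
  gst_init (NULL, NULL);
  /* ... registration of the other plugins used by the tutorial ... */
  GST_PLUGIN_STATIC_REGISTER(udp);
}
```

This fragment cannot run standalone (it needs the static GStreamer iOS libraries); it only shows the registration pattern.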
Screenshots attached:

![Screenshot_2022-11-29_at_6.31.14_PM](/uploads/02797ceae598643369cc946e4c332278/Screenshot_2022-11-29_at_6.31.14_PM.png)
![Screenshot_2022-11-29_at_6.31.58_PM](/uploads/2315540422e3f8fc12f396b351438d56/Screenshot_2022-11-29_at_6.31.58_PM.png)
![Screenshot_2022-11-29_at_6.38.55_PM](/uploads/9cc1dbbfdd3459228d00347a7a8105c3/Screenshot_2022-11-29_at_6.38.55_PM.png)

## Green screen with hardware acceleration
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1611 · 2022-11-28 · Vuk Mirovic

### Describe your issue
When playing certain videos with hardware acceleration, only a green screen is shown. Disabling hardware acceleration fixes the issue. Originally reported on the Totem issue tracker: https://gitlab.gnome.org/GNOME/totem/-/issues/552. The maintainer suggested it is a GStreamer issue.
#### Expected Behavior
Video plays normally.
#### Observed Behavior
Green screen.
#### Setup
- **Operating System:** Debian Testing
- **Device:** Computer
- **GStreamer Version:** 1.20.4
- **Command line:**
- **VAinfo:**
```
flatpak run org.freedesktop.Platform.VaInfo
Trying display: wayland
libva info: VA-API version 1.15.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/radeonsi_drv_video.so
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/intel-vaapi-driver/radeonsi_drv_video.so
libva info: Trying to open /usr/lib/x86_64-linux-gnu/GL/lib/dri/radeonsi_drv_video.so
libva info: Found init function __vaDriverInit_1_15
libva info: va_openDriver() returns 0
vainfo: VA-API version: 1.15 (libva 2.16.0)
vainfo: Driver version: Mesa Gallium driver 22.1.7 for AMD RENOIR (LLVM 14.0.6, DRM 3.48, 6.0.0-4-amd64)
vainfo: Supported profile and entrypoints
VAProfileMPEG2Simple : VAEntrypointVLD
VAProfileMPEG2Main : VAEntrypointVLD
VAProfileVC1Simple : VAEntrypointVLD
VAProfileVC1Main : VAEntrypointVLD
VAProfileVC1Advanced : VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
VAProfileH264Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointEncSlice
VAProfileH264High : VAEntrypointVLD
VAProfileH264High : VAEntrypointEncSlice
VAProfileHEVCMain : VAEntrypointVLD
VAProfileHEVCMain : VAEntrypointEncSlice
VAProfileHEVCMain10 : VAEntrypointVLD
VAProfileHEVCMain10 : VAEntrypointEncSlice
VAProfileJPEGBaseline : VAEntrypointVLD
VAProfileVP9Profile0 : VAEntrypointVLD
VAProfileVP9Profile2 : VAEntrypointVLD
VAProfileNone : VAEntrypointVideoProc
```
### Steps to reproduce the bug
<!-- please fill in exact steps which reproduce the bug on your system, for example: -->
1. Install Totem devel with `flatpak install https://flathub.org/beta-repo/appstream/org.gnome.Totem.Devel.flatpakref`
2. Play the sample video ![Sample_1](/uploads/e5e23b9f87536263a0629f4b5325c59e/Sample_1.mp4)
3. Observe the green screen
4. Go to Preferences -> Display -> Disable hardware acceleration
5. The video plays normally
### How reproducible is the bug?
<!-- The reproducibility of the bug is Always/Intermittent/Only once after doing a very specific set of steps-->
Always
### Screenshots if relevant
![Screenshot_from_2022-11-28_16-23-34](/uploads/a601c1857ea3d7760f8b07775af98556/Screenshot_from_2022-11-28_16-23-34.png)
### Solutions you have tried
Disabling hardware acceleration
### Related non-duplicate issues
### Additional Information
<!-- Any other information such as logs. Make use of <details> for long output -->

## vaapih264dec produces green artifacts on Kaby Lake GPU with some input files
https://gitlab.freedesktop.org/gstreamer/gstreamer-vaapi/-/issues/336 · 2022-11-28 · Fabrice Bellet <fabrice@bellet.info>

Hi!
I'm running the following pipeline on this [input file](/uploads/be5e20301ad949bf1c61e66307df8694/input.mp4) (426KB, 50 frames):
`gst-launch-1.0 filesrc location=input.mp4 ! qtdemux ! vaapih264dec ! videoconvert ! jpegenc ! multifilesink location="frame%02d.jpeg"`
and it generates green artifacts (starting on [frame01.jpeg](/uploads/8a2fb291bc860ab77ea18207c924258e/frame01.jpeg), mostly visible on [frame37.jpeg](/uploads/01d39259bd948bff9593bbc36320ed10/frame37.jpeg)).
* It happens with both vaapih264dec and vah264dec elements (gstreamer1-1.20.3-1.1.fc36.x86_64)
* It happens with both iHD (Intel iHD driver for Intel(R) Gen Graphics - 22.3.1 ()) and i965 (Intel i965 driver for Intel(R) Kaby Lake - 2.4.1) va drivers
* It works on another computer with a radeonsi va driver (Mesa Gallium driver 22.1.7 for AMD PITCAIRN (LLVM 14.0.0, DRM 2.50, 6.0.9-200.fc36.x86_64))
* It works with ffmpeg + vaapi + iHD driver with the following command (although it appears that output files have differences in colorimetry and smoothing, for example in [frame38.jpg](/uploads/64bb1c704a5e887af74217e2a30cf42a/frame38.jpg), frames numbering starts at 1 in ffmpeg) :
`ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -i input.mp4 -frames:v 50 frame%02d.jpg`

## tsdemux: The huge PTS causes abnormal playback
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1743 · 2022-11-28 · Hmily_gg ff

**[Phenomenon]**
- When we play back the test file, the total duration is 0:00:02.1, but the current time keeps updating past the total duration. Meanwhile, the video frame is frozen.

**[Analysis]**
- The given test file [0000003281-0000003301.mpg](/uploads/a9a5b50fa2830f9adb5cc6c2e1649f28/0000003281-0000003301.mpg) contains a huge PTS, which blocks the EOS event. But the total duration is not that long (0:00:02.1?).
Fragment log:
- tsdemux.c:2646:gst_ts_demux_parse_pes_header:<tsdemux0> stream **PTS 17:03:07.911980999** DTS 17:03:07.861925444
- tsdemux.c:2646:gst_ts_demux_parse_pes_header:<tsdemux0> stream **PTS 99:99:99.999999999** DTS 99:99:99.999999999
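For context (my note, not part of the original report): MPEG-TS PES timestamps are 33-bit ticks of a 90 kHz clock, and GStreamer prints an invalid timestamp (GST_CLOCK_TIME_NONE) as 99:99:99.999999999, so the second PTS above is not a real value from the stream. The upper bound of a well-formed TS timestamp is easy to check:

```python
# Maximum PTS representable in an MPEG-TS PES header: 33 bits of a 90 kHz clock.
MAX_PTS_TICKS = (1 << 33) - 1
max_pts_seconds = MAX_PTS_TICKS / 90_000

hours = max_pts_seconds / 3600
print(f"max TS timestamp ~ {hours:.1f} hours")

# Anything reported beyond this range cannot come from a valid PES header,
# so "PTS 99:99:99.999999999" is a sentinel for "no/invalid PTS".
```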
**[Expected Behavior]**
- Detect the huge PTS and post an ERROR message on the GstBus.
**[Setup]**
- Operating System: Ubuntu 22.04
- Device: Computer
- GStreamer Version: 1.20.3.1
- Command line: gst-play-1.0 ./0000003281-0000003301.mpg
**[Steps to reproduce the bug]**
- Download 0000003281-0000003301.mpg
- Run gst-play-1.0 ./0000003281-0000003301.mpg
**[How reproducible is the bug]**
- Always

## nvav1enc: Add support for AV1 hardware encoding
https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1744 · 2022-11-28 · 谢美龙

The 8th generation NVENC has added support for AV1 encoding ([Support Matrix](https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new)).
And rtpav1pay was just released in GStreamer 1.21.1. nvav1enc + rtpav1pay + webrtc would be a nice combination.
[origin issue](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1489)

## avdec_h264: Sometimes produces no duration
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1606 · 2022-11-28 · Hugo Svirak
In our case we're using `webrtcbin`; I believe the issue only occurs when `framerate=0/1`. The issue only happens for some videos.
This causes `videorate` to fail / assert with a no duration error.
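For illustration (a sketch of the relationship, not the actual avdec_h264 or videorate code): a per-frame duration can only be derived from the caps framerate when the numerator is non-zero, which is why `framerate=0/1` (variable/unknown rate) leaves buffers without a duration:

```python
GST_SECOND = 1_000_000_000  # nanoseconds, as in GStreamer

def frame_duration_ns(fps_n: int, fps_d: int):
    """Duration of one frame in ns, or None when the rate is 0/1 (unknown)."""
    if fps_n == 0:
        return None  # variable/unknown framerate: no duration can be inferred
    return GST_SECOND * fps_d // fps_n

print(frame_duration_ns(30, 1))  # one frame at 30 fps
print(frame_duration_ns(0, 1))   # None: elements that require a duration fail
```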
The following bug report seems to describe essentially the same issue: https://gstreamer-bugs.narkive.com/9LhWWUd6/bug-734425-new-h264parse-outputting-framerate-0-1-causes-regressions
Is this a bug in `avdec_h264`?

## Critical error on fmp4 dashmp4mux element
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/270 · 2022-11-27 · Bedilbek Khamidov

GStreamer version: 1.20.3
fmp4 Version: 0.9.1-274e57a5
Rust version: rustc 1.65.0 (897e37553 2022-11-02)
OS: Ubuntu 22.04
Linux kernel version: 5.15.0-52-generic
Hi. I am building a pipeline to split a video into fragmented 1-second MP4 chunks with GStreamer. The command below produces an error:
```bash
gst-launch-1.0 filesrc location="sample_1080p_h265.mp4" ! qtdemux name=demux demux.video_0 ! libde265dec ! x265enc key-int-max=30 ! h265parse ! video/x-h265,stream-format=hvc1,alignment=au ! .video splitmuxsink location="fragments-dash/%02d.mp4" max-size-time=1000000000 muxer=dashmp4mux
```
However, the same command with the default `mp4mux` muxer works fine.
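As a side note on the crash itself (illustrative only; the actual computation at imp.rs:650 is not shown in this report): the panic in the log below is a bare `Option::unwrap()` on a `None`, and the usual defensive pattern in element code is to propagate a `Result` instead of panicking. Names here are invented for illustration, not taken from fmp4mux:

```rust
// Hypothetical sketch: turning a missing Option value into an error instead
// of a panic that aborts the streaming thread.
fn frame_duration(pts: Option<u64>, next_pts: Option<u64>) -> Result<u64, &'static str> {
    let pts = pts.ok_or("buffer has no PTS")?;
    let next = next_pts.ok_or("next buffer has no PTS")?;
    next.checked_sub(pts).ok_or("PTS went backwards")
}

fn main() {
    assert_eq!(frame_duration(Some(0), Some(33)), Ok(33));
    // With `unwrap()` this case would panic; with `Result` it becomes a
    // clean, reportable element error.
    assert!(frame_duration(Some(0), None).is_err());
    println!("ok");
}
```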
Here is log output of the command:
```bash
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstLibde265Dec:libde265dec0: Unsupported extra data version 1, decoding may fail
Additional debug info:
../ext/libde265/libde265-dec.c(583): gst_libde265_dec_set_format (): /GstPipeline:pipeline0/GstLibde265Dec:libde265dec0
Redistribute latency...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
(... "Redistribute latency..." repeated ~25 more times ...)
thread '<unnamed>' panicked at 'called `Option::unwrap()` on a `None` value', mux/fmp4/src/fmp4mux/imp.rs:650:34
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
ERROR: from element /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0: GStreamer encountered a general supporting library error.
Additional debug info:
/home/bedilbek/.cargo/git/checkouts/gstreamer-rs-79e52a2d27eb91a3/5dd56d8/gstreamer-base/src/subclass/aggregator.rs(876): gstreamer_base::subclass::aggregator (): /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0:
Panicked: called `Option::unwrap()` on a `None` value
Execution ended after 0:00:34.788963538
Setting pipeline to NULL ...
ERROR: from element /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0: GStreamer encountered a general supporting library error.
Additional debug info:
/home/bedilbek/.cargo/git/checkouts/gstreamer-rs-79e52a2d27eb91a3/5dd56d8/gstreamer/src/subclass/element.rs(443): gstreamer::subclass::element (): /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0:
Panicked
ERROR: from element /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0: GStreamer encountered a general supporting library error.
Additional debug info:
/home/bedilbek/.cargo/git/checkouts/gstreamer-rs-79e52a2d27eb91a3/5dd56d8/gstreamer/src/subclass/element.rs(443): gstreamer::subclass::element (): /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0:
Panicked
ERROR: from element /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0: GStreamer encountered a general supporting library error.
Additional debug info:
/home/bedilbek/.cargo/git/checkouts/gstreamer-rs-79e52a2d27eb91a3/5dd56d8/gstreamer/src/subclass/element.rs(443): gstreamer::subclass::element (): /GstPipeline:pipeline0/GstSplitMuxSink:splitmuxsink0/GstDASHMP4Mux:dashmp4mux0:
Panicked
Freeing pipeline ...
```

## ci: Only run manually and make use of the merge bot
https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/issues/401 · 2022-11-27 · Sebastian Dröge

Should get basically the same configuration as the monorepo. Same applies to gstreamer-rs / gst-plugins-rs.
CC @alatiera

Assignee: Jordan Petridіs

## Installation on Python 3.6 throws an error while compiling
https://gitlab.freedesktop.org/gstreamer/meson-ports/ffmpeg/-/issues/30 · 2022-11-27 · Seungmin Kim

```bash
[4027/6612] Generating subprojects/FFmpeg/avutil-def with a custom command (wrapped by meson to capture output)
FAILED: subprojects/FFmpeg/avutil.def
/usr/local/bin/meson --internal exe --capture subprojects/FFmpeg/avutil.def -- /usr/bin/python3 /src/gstreamer/subprojects/FFmpeg/compat/windows/makedef.py --nm /usr/bin/nm --prefix '' /src/gstreamer/builddir/subprojects/FFmpeg/libavutil/libavutil.ver /src/gstreamer/builddir/subprojects/FFmpeg/libavutil-static.a
--- stderr ---
Traceback (most recent call last):
File "/src/gstreamer/subprojects/FFmpeg/compat/windows/makedef.py", line 74, in <module>
'-g', libname], capture_output=True, text=True, check=True)
File "/usr/lib/python3.6/subprocess.py", line 423, in run
with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'capture_output'
[4028/6612] Compiling C object subprojects/FFmpeg/test_avutil_color_utils.p/libavutil_tests_color_utils.c.o
[4029/6612] Linking target subprojects/FFmpeg/test_avutil_cpu
ninja: build stopped: subcommand failed.
```
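The `TypeError` in the traceback comes from `subprocess.run(..., capture_output=True)`: that keyword only exists on Python 3.7+. A Python 3.6-compatible call looks like this (a sketch of the pattern, not the actual makedef.py patch; the command is a stand-in):

```python
import subprocess
import sys

# capture_output=True is Python 3.7+; on 3.6 pass the pipes explicitly.
# Both forms capture stdout/stderr as text and raise on a non-zero exit code.
result = subprocess.run(
    [sys.executable, "-c", "print('hello from nm stand-in')"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,  # 3.6-compatible spelling of text=True
    check=True,
)
print(result.stdout.strip())
```

Note that `text=True` is also 3.7+; `universal_newlines=True` is the 3.6-compatible spelling.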
This is on Ubuntu 18.04 with the default Python 3.6. The build fails on 1.20.3 and 1.20.4. `capture_output` only exists on Python 3.7+; the more compatible way is to pass `stdout=subprocess.PIPE, stderr=subprocess.PIPE` to `subprocess.run`.

## Cannot enable libvpx_vp8_encoder
https://gitlab.freedesktop.org/gstreamer/meson-ports/ffmpeg/-/issues/17 · 2022-11-27 · Julien Isorce
I am trying to enable `avenc_vp8` for testing purposes, to compare with `vp8enc`,
but when building FFmpeg with gst-build and its subprojects/FFmpeg.wrap, I get:
Running: `meson build -Dlibvpx_vp8_encoder=enable`
->
```
Dependency vpx found: YES (cached)
Has header "vpx/vpx_encoder.h" with dependency vpx: YES
Has header "vpx/vp8cx.h" with dependency vpx: YES
Checking for function "vpx_codec_vp8_cx" with dependency vpx: YES
Dependency vpx found: YES (cached)
```
but meson-log.txt still shows `libvpx_vp8_encoder=0`.
So I wonder if there is some high-level config file I am missing that disables the VP8 encoder.
Thx!

## playbin: read ahead files, better support for network shares and other slow inputs
https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/69 · 2022-11-27 · Bugzilla Migration User

## Submitted by Andrea Corbellini
**[Link to original bug (#679708)](https://bugzilla.gnome.org/show_bug.cgi?id=679708)**
## Description
Playing files and devices located in not-so-low-latency locations with Totem currently is not always optimal. Examples of such locations include slow DVDs, SSH shares and NBD devices.
When playing a movie from those locations I can observe the following:
* the first N minutes of the movie are played perfectly well;
* after those N minutes, the movie stops for a while;
* the movie continues again for N minutes;
* the movie re-stops and so on...
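A toy model (my sketch, not part of the original report) of this periodic stall pattern: fetching only once the buffer is empty stalls playback for the length of every fetch, while refilling ahead of time does not:

```python
def stalled_ticks(refill_threshold, capacity=10, fetch_delay=3, ticks=200):
    """Count playback stalls: playback consumes 1 unit per tick; a fetch takes
    fetch_delay ticks to complete and then refills the buffer to capacity."""
    buf, stalls, fetch_remaining = capacity, 0, 0
    for _ in range(ticks):
        if buf <= refill_threshold and fetch_remaining == 0:
            fetch_remaining = fetch_delay  # start a fetch
        if fetch_remaining > 0:
            fetch_remaining -= 1
            if fetch_remaining == 0:
                buf = capacity             # fetch finished
        if buf > 0:
            buf -= 1                       # playback consumes one unit
        else:
            stalls += 1                    # nothing left to play
    return stalls

print(stalled_ticks(refill_threshold=0))  # fetch only when empty: stalls
print(stalled_ticks(refill_threshold=3))  # refill ahead of time: no stalls
```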
It seems to me (though I might be wrong) that Totem fetches only when its read buffer is exhausted. To play files reliably, Totem should instead continuously refill the buffer, trying to keep it at a reasonable size at all times.

## gst-plugins-base fails to build in old distros like CentOS 7
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1602 · 2022-11-25 · Ignacio Casal

Build failures like the following:
```
FAILED: gst-libs/gst/video/libgstvideo-1.0.so.0.2001.0.p/video-converter.c.o
cc -Igst-libs/gst/video/libgstvideo-1.0.so.0.2001.0.p -Igst-libs/gst/video -I../gst-libs/gst/video -I. -I.. -Igst-libs -I../gst-libs -I/tmp/build-artifacts.sh-LPkBP/build/inst/include/gstreamer-1.0 -I/tmp/build-artifacts.sh-LPkBP/build/inst/include/glib-2.0 -I/tmp/build-artifacts.sh-LPkBP/build/inst/lib/glib-2.0/include -I/tmp/build-artifacts.sh-LPkBP/build/inst/include/orc-0.4 -D_FILE_OFFSET_BITS=64 -Wall -Winvalid-pch -O2 -g -fvisibility=hidden -fno-strict-aliasing -DG_DISABLE_CAST_CHECKS -Wmissing-declarations -Wredundant-decls -Wundef -Wwrite-strings -Wformat -Wformat-nonliteral -Wformat-security -Winit-self -Wmissing-include-dirs -Waddress -Wno-multichar -Wvla -Wpointer-arith -Wmissing-prototypes -Wdeclaration-after-statement -fstack-protector -g -O2 -fno-strict-aliasing -Wformat -D_FORTIFY_SOURCE=2 -fPIC -pthread -DHAVE_CONFIG_H -DBUILDING_GST_VIDEO '-DG_LOG_DOMAIN="GStreamer-Video"' -MD -MQ gst-libs/gst/video/libgstvideo-1.0.so.0.2001.0.p/video-converter.c.o -MF gst-libs/gst/video/libgstvideo-1.0.so.0.2001.0.p/video-converter.c.o.d -o gst-libs/gst/video/libgstvideo-1.0.so.0.2001.0.p/video-converter.c.o -c ../gst-libs/gst/video/video-converter.c
../gst-libs/gst/video/video-converter.c: In function 'convert_I420_v210':
../gst-libs/gst/video/video-converter.c:3771:7: error: 'for' loop initial declarations are only allowed in C99 mode
for (int j = width * 4 - 1; j >= 0; j--) {
^
../gst-libs/gst/video/video-converter.c:3771:7: note: use option -std=c99 or -std=gnu99 to compile your code
../gst-libs/gst/video/video-converter.c: In function 'convert_v210_I420':
../gst-libs/gst/video/video-converter.c:4018:7: error: 'for' loop initial declarations are only allowed in C99 mode
for (int j = 0; j < width * 4; j++) {
^
```
I wonder if this could be fixed upstream the same way it was done for glib, by passing `'c_std=gnu99'` to the meson default_options. Thoughts?

## gtk4: Generate the gdk Textures from GStreamer in the PaintableSink
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/273 · 2022-11-25 · Jordan Petridіs

The following discussion from !588 should be addressed:
- [ ] @slomo started a [discussion](https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/588#note_1247990): (+3 comments)
> `gdk::Texture` is `Send + Sync` now, which might allow some simplifications here. You could create the textures on the GStreamer streaming thread already and then just send that through the channel.
> We could potentially create `gdk::Texture`s already in `BaseSinkImpl::render` / `VideoSinkImpl::show_frame` from the GStreamer thread, and then move those into the UI thread for displaying

## valgrind: consolidate all suppression files into a single file
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1600 · 2022-11-25 · Mathieu Duponchelle

A single valgrind suppression file would be much more handy; any reason not to switch to that?

## va: Fix/add buffer stride/offset validation
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1223 · 2022-11-25 · Nicolas Dufresne

On my system (Vega64), the VA element tries P010 decoding and exports it to DMABuf, and that goes terribly wrong and fails.
```
0:08:13.958544331 410736 0x7fff840990c0 ERROR videometa gstvideometa.c:242:default_map: plane 1, no memory at offset 16588800
0:08:13.958598383 410736 0x7fff840990c0 ERROR default video-frame.c:168:gst_video_frame_map_id: failed to map video frame plane 1
```
While debugging that, I've stumbled on what looks like a driver bug, but I also pretty much dislike the sloppiness of the dmabuf exporting code (gst_va_dmabuf_allocator_setup_buffer_full()). This function is partially implemented and is missing any kind of validation. Here's the PRIME descriptor that I'm getting:
```
$4 = {
fourcc = 808530000,
width = 3840,
height = 2160,
num_objects = 2,
objects = {{
fd = 28,
size = 8294400,
drm_format_modifier = 0
}, {
fd = 29,
size = 8294400,
drm_format_modifier = 0
}, {
fd = 0,
size = 0,
drm_format_modifier = 0
}, {
fd = 0,
size = 0,
drm_format_modifier = 0
}},
num_layers = 2,
layers = {{
drm_format = 540422482,
num_planes = 1,
object_index = {0, 0, 0, 0},
offset = {0, 0, 0, 0},
pitch = {7680, 0, 0, 0}
}, {
drm_format = 842224199,
num_planes = 1,
object_index = {1, 0, 0, 0},
offset = {16588800, 0, 0, 0},
pitch = {7680, 0, 0, 0}
}, {
drm_format = 0,
num_planes = 0,
object_index = {0, 0, 0, 0},
offset = {0, 0, 0, 0},
pitch = {0, 0, 0, 0}
}, {
drm_format = 0,
num_planes = 0,
object_index = {0, 0, 0, 0},
offset = {0, 0, 0, 0},
pitch = {0, 0, 0, 0}
}}
}
```
So basically two DMABufs and 2 layers. That makes sense, more or less; I'm a little surprised that the offsets aren't relative to the object starts here, if you see what I mean. But perhaps this is documented somewhere? In short, the first bug: the size of the first DMABuf should have been at least 16588800 (which matches the offset), but it has been set to 8294400. The pitch looks fine, and the second dmabuf's size is fine, since it's sub-sampled. We should be able to fail early and cleanly on this type of driver error, imho.
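A sketch of that early validation (mine, not existing GStreamer code), using the numbers from the descriptor above: each plane's rows must fit inside the DMABuf object it points into.

```python
def plane_fits(obj_size, offset, pitch, rows):
    """A plane is valid only if its last row ends inside the DMABuf object."""
    return offset + pitch * rows <= obj_size

# Numbers from the PRIME descriptor above (P010, 3840x2160):
# object 0 was exported with size 8294400, but the luma plane alone needs
# pitch 7680 * 2160 rows = 16588800 bytes.
assert not plane_fits(obj_size=8294400, offset=0, pitch=7680, rows=2160)

# With the size the report says it should have been (>= 16588800), it fits:
assert plane_fits(obj_size=16588800, offset=0, pitch=7680, rows=2160)
print("validation catches the undersized object")
```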
cc @He_Junyan @vjaquez

## RTSP URL with starttime does not give correct video stream
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/771 · 2022-11-24 · Ravi Kumar

### Describe your issue
I am trying to stream videos from a HIKVISION DVR. The RTSP URL takes a starttime and endtime to stream video from that time range. Example URL: rtsp://admin:admin@10.96.251.207/Streaming/tracks/401/?starttime=20211010T141904Z&endtime=20211010T144059Z
#### Expected Behavior
The RTSP should stream video from time mentioned in **Starttime**
#### Observed Behavior
When using the GStreamer pipeline, the streamed video started earlier than the requested starttime.
When streaming with FFmpeg, the video was correct as per starttime.
#### Setup
- **Operating System:** RHEL 7
- **Device:** Linux Vms
- **GStreamer Version:** 1.18.4
- **Command line:** gst-launch-1.0 rtspsrc location='rtsp://admin:admin@10.96.251.207/Streaming/tracks/401/?starttime=20211010T141904Z&endtime=20211010T144059Z' name=rtsp ! rtph264depay ! h264parse ! splitmuxsink name=sink mux=mp4mux max-size-time=450000000000 location=video%02d.mp4

## Base parse fails to infer the correct PTS when splitting one video frame into several sub buffers
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1515 · 2022-11-24 · He Junyan

### Describe your issue
<!-- a clear and concise summary of the bug. -->
Base parse fails to infer the correct PTS when splitting one video frame into several sub buffers.
<!-- For any GStreamer usage question, please contact the community using the #gstreamer channel on IRC https://www.oftc.net/ or the mailing list on https://gstreamer.freedesktop.org/lists/ -->
#### Observed Behavior
<!-- What did you expect to happen -->
There are use cases where we need to split one video frame into several sub buffers. For example, if we want to split the MKV stream ([multi_slice_h264.mkv](/uploads/987119b6b81a63463ec341b6fd5e3f06/multi_slice_h264.mkv)) into H.264 NALs, using the command line:
```
gst-launch-1.0 -vf filesrc location=multi_slice_h264.mkv ! matroskademux name=de de.video_0 ! h264parse ! video/x-h264,alignment=nal ! fakesink silent=false
```
The output is:
```
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (34 bytes, dts: none, pts: 0:00:00.000000000, duration: 0:00:00.000000000, offset: -1, offset_end: -1, flags: 00006440 discont header delta-unit tag-memory , meta: none) 0x7f3df001f120
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (8 bytes, dts: none, pts: none, duration: 0:00:00.000000000, offset: -1, offset_end: -1, flags: 00006440 discont header delta-unit tag-memory , meta: none) 0x7f3df001f480
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (288 bytes, dts: none, pts: none, duration: 0:00:00.033333333, offset: -1, offset_end: -1, flags: 00004040 discont tag-memory , meta: none) 0x7f3df001f120
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (289 bytes, dts: none, pts: none, duration: 0:00:00.000000000, offset: -1, offset_end: -1, flags: 00004040 discont tag-memory , meta: none) 0x560e960ccb40
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (4832 bytes, dts: none, pts: none, duration: 0:00:00.000000000, offset: -1, offset_end: -1, flags: 00004040 discont tag-memory , meta: none) 0x7f3df001f120
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (73 bytes, dts: none, pts: 0:00:00.033000000, duration: 0:00:00.033333333, offset: -1, offset_end: -1, flags: 00006000 delta-unit tag-memory , meta: none) 0x560e960cc6c0
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (74 bytes, dts: none, pts: none, duration: 0:00:00.000000000, offset: -1, offset_end: -1, flags: 00006000 delta-unit tag-memory , meta: none) 0x7f3df001f480
/GstPipeline:pipeline0/GstFakeSink:fakesink0: last-message = chain ******* (fakesink0:sink) (3480 bytes, dts: none, pts: none, duration: 0:00:00.000000000, offset: -1, offset_end: -1, flags: 00006000 delta-unit tag-memory , meta: none) 0x560e960cc6c0
```
We notice that only the first split output buffer of an input frame has a PTS; the others are none.
#### Expected Behavior
Each buffer should have a correct PTS. Here, each sub buffer should have the same PTS as the input frame buffer.
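The expected propagation can be sketched like this (an illustrative model, not baseparse code): when one input frame with a PTS is split, every sub buffer keeps that PTS instead of only the first one getting it:

```python
def split_frame(frame_pts, nal_sizes):
    """Split one frame into sub buffers; each inherits the frame's PTS."""
    return [{"size": size, "pts": frame_pts} for size in nal_sizes]

# Sizes taken from the first frame in the log above.
bufs = split_frame(frame_pts=0, nal_sizes=[34, 8, 288, 289, 4832])
assert all(b["pts"] == 0 for b in bufs)  # no sub buffer ends up with pts: none
print([b["pts"] for b in bufs])
```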
### Solutions you have tried
The same kind of issue also exists in the VP9 and AV1 parsers (and maybe more). The base parse (https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gstreamer/libs/gst/base/gstbaseparse.c#L2407) always wipes out the PTS and DTS and infers new ones for us, which is not correct.
### Related non-duplicate issues
We already have a lot of discussion at:
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3182#note_1591585

## Fix crash in gst_mf_dshow_enum_device
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1599 · 2022-11-23 · Jan Lorenz

According to the MSDN documentation here: https://learn.microsoft.com/en-us/previous-versions/ms784969(v=vs.85)
the result of `CreateClassEnumerator` must be checked directly against `S_OK`. The gst_mf_result() function uses FAILED or SUCCEEDED internally, so the test succeeds despite the actual ComPtr being null, and later on this results in a crash during `enum_moniker->Next`.
The attached patch uses the correct approach to check the result: [702f52d054bf6b13e80f2ce7b779f068ab1fe7f5.patch](/uploads/d8c3f8d0ccef09aadec9b915d94bd136/702f52d054bf6b13e80f2ce7b779f068ab1fe7f5.patch)
This was tested on GStreamer 1.21.2.

## playbin3: about-to-finish signal missing after short network streams
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1588 · 2022-11-23 · Robert Tiemann
The issue can be reproduced with `gst-play-1.0`, playing some short files from a DLNA server:
`gst-play-1.0 --use-playbin3 --gapless http://my.dlna.server/20_seconds.flac http://my.dlna.server/5_seconds.flac http://my.dlna.server/20_seconds.flac`.
The files [20_seconds.flac](/uploads/dd5ef5606642dd95573e4c1391e182ee/20_seconds.flac) and [5_seconds.flac](/uploads/a914eb7004879151cee85138046da414/5_seconds.flac) are 20 and 5 seconds long, respectively.
- Playing the files directly with `gst-play-1.0` in the order suggested above prints `About to finish, preparing next title` twice as expected.
- Playing the files from a DLNA server, however, prints `About to finish, preparing next title` only once near the end of the first stream. The expected message near the end of the second stream is missing.
- The short stream must be preceded by a long stream to trigger the bug. When starting playback with the short stream, the `about-to-finish` signal is sent, but immediately at the beginning instead of near the end of the short stream.
Tested with commit b2bfb066ec4409d1112af000e82a8510a0d1eca4 on the main branch; 100% reproducible.

Assignee: Edward Hervey

## kmssink: Fails to compile `drm/drm_mode.h: No such file or directory`
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1596 · 2022-11-23 · Sebastian Dröge

The following discussion from !3303 should be addressed:
- [ ] @slomo started a [discussion](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3303#note_1651278):
> How does this compile for you?
>
> ```
> ../subprojects/gst-plugins-bad/sys/kms/gstkmssink.h:30:10: fatal error: drm/drm_mode.h: No such file or directory
> 30 | #include <drm/drm_mode.h>
> | ^~~~~~~~~~~~~~~~
> compilation terminated.
> ```
>
> The file exists here as `/usr/include/libdrm/drm_mode.h` and based on the include flags, `#include <drm_mode.h>` works for me.

Milestone: 1.21.3