# GStreamer issues
https://gitlab.freedesktop.org/groups/gstreamer/-/issues (feed retrieved 2024-01-30)

## Gst.ValueArray containing other Gst.ValueArrays is not accepted as property value where it should be
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2454 · Christian Glodt · 2024-01-30

I'm trying to set the 'mix-matrix' property of an audioconvert element from Python. The property type is a Gst.ValueArray containing Gst.ValueArrays that contain floats.
I'm on these versions:
- OS: Ubuntu 22.10
- python3-gst-1.0 package: 1.20.3-1
- gstreamer1.0-plugins-base package: 1.20.3-2
Here's an example where it fails:
```python
import sys
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GLib', '2.0')
from gi.repository import Gst, GObject, GLib
Gst.init(sys.argv)
audioconvert = Gst.ElementFactory.make('audioconvert')
mix_matrix = Gst.ValueArray([Gst.ValueArray([0.5, 0.5]), Gst.ValueArray([0.5, 0.5])])
audioconvert.set_property('mix-matrix', mix_matrix)
print(audioconvert.get_property('mix-matrix'))
```
The first time I set the property, I get a warning:
```
Warning: value "< < 0.500000, 0.500000 >, < 0.500000, 0.500000 > >" of type 'GstValueArray' is invalid or out of range for property 'mix-matrix' of type 'GstValueArray'
```
And the value I'm trying to set does not take:
```pyconsole
>>> print(audioconvert.get_property('mix-matrix'))
<>
```
If I try to set the mix-matrix again, I don't get a warning anymore, but the value still doesn't take.
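One pure-Python approach worth trying (a sketch only, not verified against this GStreamer version) is to sidestep the Python GValue marshalling entirely: serialize the matrix into GStreamer's string syntax for nested value arrays and let `Gst.util_set_object_arg()` (the binding of the C `gst_util_set_object_arg()`) deserialize it. The exact serialization format below is an assumption.

```python
# Sketch: build the GstValueArray string form "<<(float)0.5,...>,<...>>" by
# hand, then hand it to GStreamer's string deserializer instead of the broken
# Python -> GValue marshalling. The string syntax is an assumption here.
def mix_matrix_to_string(rows):
    """Serialize a list of float lists into GStreamer nested-array syntax."""
    inner = ("<" + ",".join(f"(float){v}" for v in row) + ">" for row in rows)
    return "<" + ",".join(inner) + ">"

s = mix_matrix_to_string([[0.5, 0.5], [0.5, 0.5]])
print(s)  # <<(float)0.5,(float)0.5>,<(float)0.5,(float)0.5>>

# Untested usage with the element from above (requires a Gst environment):
#   Gst.util_set_object_arg(audioconvert, "mix-matrix", s)
```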
Is there a pure-Python workaround for this problem? My current workaround is a custom Python C-extension that does it from C, which is quite inconvenient.

## oggmux not creating valid files for opus received via RTP (unless explicit caps or opusparse is present)
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2424 · Tristan Matthews · 2023-03-28

With git (using 49c9f31803ad9728b653c1ce75094adeb889221e):
```
gst-launch-1.0 -e -v audiotestsrc num-buffers=1000 ! opusenc ! rtpopuspay ! rtpopusdepay ! oggmux ! filesink location=testsrc.ogg
```
the resulting file is missing the `OpusHead` header etc. and can't be played:
```
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstAudioTestSrc:audiotestsrc0.GstPad:src: caps = audio/x-raw, rate=(int)48000, channels=(int)1, format=(string)S16LE, layout=(string)interleaved
/GstPipeline:pipeline0/GstOpusEnc:opusenc0.GstPad:sink: caps = audio/x-raw, rate=(int)48000, channels=(int)1, format=(string)S16LE, layout=(string)interleaved
Redistribute latency...
/GstPipeline:pipeline0/GstOpusEnc:opusenc0.GstPad:src: caps = audio/x-opus, rate=(int)48000, channels=(int)1, channel-mapping-family=(int)0, stream-count=(int)1, coupled-count=(int)0, streamheader=(buffer)< 4f707573486561640101380180bb0000000000, 4f707573546167731e000000456e636f6465642077697468204753747265616d6572206f707573656e63010000001a0000004445534352495054494f4e3d617564696f74657374207761766501 >
/GstPipeline:pipeline0/GstRtpOPUSPay:rtpopuspay0.GstPad:src: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)OPUS, sprop-stereo=(string)0, encoding-params=(string)2, sprop-maxcapturerate=(string)48000, payload=(int)96, ssrc=(uint)4224652647, timestamp-offset=(uint)281281597, seqnum-offset=(uint)28254
/GstPipeline:pipeline0/GstRTPOpusDepay:rtpopusdepay0.GstPad:src: caps = audio/x-opus, channel-mapping-family=(int)0, channels=(int)1, rate=(int)48000
/GstPipeline:pipeline0/GstOggMux:oggmux0.GstPad:audio_889019218: caps = audio/x-opus, channel-mapping-family=(int)0, channels=(int)1, rate=(int)48000
/GstPipeline:pipeline0/GstRTPOpusDepay:rtpopusdepay0.GstPad:sink: caps = application/x-rtp, media=(string)audio, clock-rate=(int)48000, encoding-name=(string)OPUS, sprop-stereo=(string)0, encoding-params=(string)2, sprop-maxcapturerate=(string)48000, payload=(int)96, ssrc=(uint)4224652647, timestamp-offset=(uint)281281597, seqnum-offset=(uint)28254
/GstPipeline:pipeline0/GstRtpOPUSPay:rtpopuspay0.GstPad:sink: caps = audio/x-opus, rate=(int)48000, channels=(int)1, channel-mapping-family=(int)0, stream-count=(int)1, coupled-count=(int)0, streamheader=(buffer)< 4f707573486561640101380180bb0000000000, 4f707573546167731e000000456e636f6465642077697468204753747265616d6572206f707573656e63010000001a0000004445534352495054494f4e3d617564696f74657374207761766501 >
/GstPipeline:pipeline0/GstRtpOPUSPay:rtpopuspay0: timestamp = 281281597
/GstPipeline:pipeline0/GstRtpOPUSPay:rtpopuspay0: seqnum = 28254
0:00:00.026207252 2983687 0x562fc2a94800 ERROR oggmux gstoggmux.c:1063:gst_ogg_mux_queue_pads:<oggmux0:audio_889019218> mapper didn't recognise input stream (pad caps: audio/x-opus, channel-mapping-family=(int)0, channels=(int)1, rate=(int)48000)
/GstPipeline:pipeline0/GstOggMux:oggmux0.GstPad:src: caps = application/ogg, streamheader=(buffer)< 4f67675300020000000000000000525bfd3400000000d393351001fdf8b50e7d91cc0582757248c1edf330e4fc3244559a0eb22341bf71c0d7629492c7235296a7807c99b03cfbb4bbac96098d0a5a47ec34ed9fdbf0bf9486bb35e23f4209c156cb86395ba2c7992aaf8cbac090fa3cf6723caa903a901b2b75cf577e7b8c68b924943db538bcd4f57551503003800066a32ac441d1a8dec6c61a79d8784f38907dd8784e32d61f63b902884a55aed31e9b57c5aed3eb8fad2a11bb9d77bd8f28d2a6f7b3ff3e45ae9146b0f657518f9d03568f6a01bc2a02379b7d9e8efe99bad517386e3351abe963fdfab4fd8e3981e10623fca8fa68b9f0dc1859ffb4aa117aaa65b60d141ad643734b13d8ecbb7411297056d48f8f6d >
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = application/ogg, streamheader=(buffer)< 4f67675300020000000000000000525bfd3400000000d393351001fdf8b50e7d91cc0582757248c1edf330e4fc3244559a0eb22341bf71c0d7629492c7235296a7807c99b03cfbb4bbac96098d0a5a47ec34ed9fdbf0bf9486bb35e23f4209c156cb86395ba2c7992aaf8cbac090fa3cf6723caa903a901b2b75cf577e7b8c68b924943db538bcd4f57551503003800066a32ac441d1a8dec6c61a79d8784f38907dd8784e32d61f63b902884a55aed31e9b57c5aed3eb8fad2a11bb9d77bd8f28d2a6f7b3ff3e45ae9146b0f657518f9d03568f6a01bc2a02379b7d9e8efe99bad517386e3351abe963fdfab4fd8e3981e10623fca8fa68b9f0dc1859ffb4aa117aaa65b60d141ad643734b13d8ecbb7411297056d48f8f6d >
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Redistribute latency...
New clock: GstSystemClock
Got EOS from element "pipeline0".
EOS received - stopping pipeline...
Execution ended after 0:00:00.126034469
Setting pipeline to NULL ...
Freeing pipeline ...
```
If I put an `opusparse` or explicit caps before the oggmux, the file is OK.

## Fallbackswitch does not work with glupload
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/334 · Manuel Schärer · 2023-04-05

I want to use the gstreamer plugin fallbacksrc on Linux.
fallbacksrc (with ximagesink) -> OK
```
gst-launch-1.0 fallbacksrc \
uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm \
fallback-uri=file:///path/to/some/jpg \
! videoconvert \
! ximagesink
```
It works as expected.
fallbacksrc (with glimagesink) -> NG
```
gst-launch-1.0 fallbacksrc \
uri=https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm \
fallback-uri=file:///path/to/some/jpg \
! videoconvert \
! glimagesink
```
It does not work.
Error message:
```
thread '<unnamed>' panicked at 'called `Result::unwrap()` on an `Err` value: Noformat', utils/fallbackswitch/src/fallbacksrc/imp.rs:1897:36
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
fatal runtime error: failed to initiate panic, error 1073887984
Aborted (core dumped)
```
===============================================
Another pipeline that does not work:
```
gst-launch-1.0 \
glvideomixer name=mix background=black ! nvvideoconvert interpolation-method=5 ! video/x-raw, format=NV12, width=1920, height=1080, framerate=25/1 ! nvoverlaysink overlay=0 sync=1 \
fallbacksrc uri=rtsp://admin:password@10.192.48.142 name=rtspsrc fallback-uri=file:///srv/images/camInterrupt.jpg immediate-fallback=1 min-latency=2000000000 timeout=2000000000 ! queue ! \
nvvideoconvert interpolation-method=5 ! \
queue2 ! switchInput.sink_0 \
input-selector name=switchInput ! glupload ! mix. \
multifilesrc location=/srv/fastData/overlay.svg caps=image/svg,width=1920,height=1080,framerate=1/1 ! rsvgdec ! video/x-raw,format=BGRA ! glupload ! mix.
```
But this works:
```
gst-launch-1.0 \
glvideomixer name=mix background=black ! nvvideoconvert interpolation-method=5 ! video/x-raw, format=NV12, width=1920, height=1080, framerate=25/1 ! nvoverlaysink overlay=0 sync=1 \
uridecodebin3 uri=rtsp://admin:password@10.192.48.142 name=rtspsrc ! queue ! \
nvvideoconvert interpolation-method=5 ! \
queue2 ! switchInput.sink_0 \
input-selector name=switchInput ! glupload ! mix. \
multifilesrc location=/srv/fastData/overlay.svg caps=image/svg,width=1920,height=1080,framerate=1/1 ! rsvgdec ! video/x-raw,format=BGRA ! glupload ! mix.
```
Thanks for your help.

## rtspsrc vs fallbacksrc
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2426 · SOLOR HAN · 2023-03-28

Hi,
We loved the idea of fallbacksrc and need it very much, but our pipeline works with rtspsrc, does not work with fallbacksrc, and there are no errors.
```shell
function create_rtsp_sources() {
    echo "Rtsp sources creating"
    for ((n = 0; n < $num_of_src; n++)); do
        src_name="SRC_${n}" src_name="${!src_name}"
        rtsp_sources+="rtspsrc async-handling=true location=$src_name name=source_$n message-forward=true ! \
            decodebin3 ! \
            queue name=custom_preprocess_q_$n leaky=no max-size-buffers=32 max-size-bytes=0 max-size-time=0 ! \
            $decode_scale_elements ! videoconvert n-threads=8 ! \
            video/x-raw,pixel-aspect-ratio=1/1 ! \
            fun.sink_$n sid.src_$n ! \
            queue name=comp_q_$n leaky=downstream max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! \
            comp.sink_$n "
        #echo "$n -- $rtsp_sources \n\n"
    done
}
```
Isn't fallbacksrc supposed to produce the same output as rtspsrc?
How can we replace the rtspsrc with fallbacksrc in the above pipeline?

## List of condition variables without a surrounding loop
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2427 · Michael Grüner · 2023-04-16

As per @slomo's request, I'm opening an independent issue to track the condition variables used without a loop. You may find the original discussion in !4086.
Here's a `clang-query` script that finds `g_cond_wait` ran without a loop:
**matcher.txt**:
```
# while loop with its body
let m1 compoundStmt(hasDescendant(whileStmt()))
# call to g_cond_wait that is not in a while body
let m2 callExpr(callee(functionDecl(hasName("g_cond_wait"))),unless(hasParent(m1))).bind("cond_wait_without_loop")
m m2
```
Run this in the GStreamer root (requires parallel, clang-query and jq):
```shell
parallel clang-query -f ../matcher.txt -- `jq -r '.[].file' compile_commands.json`
```
This reports 300+ instances, not counting `g_cond_wait_until`. Many of them are false positives, as noted by @slomo and @ndufresne.
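For a rough first pass without clang tooling, a naive textual scan can also flag `g_cond_wait` calls with no `while` nearby. This is a hypothetical sketch, far less precise than the AST-based clang-query matcher above, since it cannot see real block structure:

```python
import re

# Naive heuristic: flag g_cond_wait( calls with no "while" in the preceding
# couple of lines. Purely textual; expect even more false positives than the
# clang-query matcher, which at least sees the AST.
def flag_cond_wait(source, window=2):
    lines = source.splitlines()
    hits = []
    for i, line in enumerate(lines):
        if re.search(r"\bg_cond_wait\s*\(", line):
            context = lines[max(0, i - window):i + 1]
            if not any(re.search(r"\bwhile\b", l) for l in context):
                hits.append(i + 1)  # 1-based line numbers
    return hits

src = """\
while (!ready)
  g_cond_wait (&cond, &lock);
if (!done)
  g_cond_wait (&cond, &lock);
"""
print(flag_cond_wait(src))  # [4]
```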
Here's the list that the script above generates.
[g_cond_wait_without_loop.txt](/uploads/d32a431a57b63e8599f3bf237d442adb/g_cond_wait_without_loop.txt)

## Follow-up from "sdpdemux: Add support for RFC4570 SDP source filters"
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2437 · Seungha Yang · 2023-03-30

The following discussion from !3485 should be addressed:
- [ ] @slomo started a [discussion](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3485#note_1666401): (+4 comments)
> If `retrieve-sender-address=true`, you could implement negative filters around that. Not sure if there's a better way.

## vtdec: support P010_10LE output
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2438 · bkarasm · 2023-03-31

I'm trying to add 10-bit output in the applemedia/vtdec element for H265 decoding. I identified that pixel format `kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange` corresponds to `GST_VIDEO_FORMAT_P010_10LE`, so I added the necessary conversions and everything works fine as long as I don't use `GLMemory`.
However, when I use GLMemory and pass decoded pictures to `glimagesink`, the video gets corrupted on the conversion to `RGBA` in the `glcolorconvert` element, and all I can see is a green rectangle that fills the entire display window. Here is the pipeline that I use:
`gst-launch-1.0 filesrc location=file.h265 ! h265parse ! vtdec ! "video/x-raw(memory:GLMemory),format=(string)P010_10LE" ! glimagesink`
Interestingly, if I dump vtdec output to file (see pipeline below) and open it with a raw YUV video player the video looks ok.
`gst-launch-1.0 filesrc location=file.h265 ! h265parse ! vtdec ! "video/x-raw(memory:GLMemory),format=(string)P010_10LE" ! filesink location=file.raw`
I'd appreciate any suggestions about what I might be missing.

## openh264dec: Potential memory corruption when writing output buffers
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2439 · Philippe Normand · 2023-04-20

This comes from @mcatanzaro, when browsing with Epiphany TP on wunderground.com.
I wasn't able to reproduce it yet with `gst-play-1.0`, so the issue is not confirmed to be a bug in GStreamer, but while reading the backtrace, thread 41 in particular led me to read the code of `gst_openh264dec_handle_frame()`, especially towards the end when writing the frame output buffer.
We allocate the output frame and then map it to write the YUV data. The doc of `gst_video_decoder_allocate_output_frame()` specifies the output buffer "is owned by the frame and you should only keep references to the frame, not the buffer" but IIUC we do ref the buffer when mapping it with `gst_video_frame_map()`. Shouldn't we use the `GST_VIDEO_FRAME_MAP_FLAG_NO_REF` when mapping the video-frame?
Also, when un-mapping, we unmap the codec state and then the frame. Shouldn't this order be reversed?

## vah264dec: vah264enc: support VAProfileH264High10
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2441 · Víctor Manuel Jáquez Leal · 2023-04-03

https://github.com/intel/libva/pull/664

## GstInfo: Gst.debug_remove_log_function not working for user generated log functions (Python)
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2442 · Juan Manuel Consoli · 2023-04-03

### Describe your issue
When trying to remove a user log function using the Python bindings, I always get 0 as a result of Gst.debug_remove_log_function()
Test code
```python
import gi
gi.require_version("GLib", "2.0")
gi.require_version("GObject", "2.0")
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib
Gst.init(None)
def custom_log_function(category, level, file, function, line, obj, message, user_data):
    pass  # some code
Gst.debug_remove_log_function(None)
#Outputs 1 as expected (removing default log function)
Gst.debug_add_log_function(custom_log_function, None)
Gst.debug_remove_log_function(custom_log_function)
#Outputs 0, (custom_log_function doesn't get removed)
```
#### Expected Behavior
Gst.debug_remove_log_function(custom_log_function) should return 1
#### Observed Behavior
Gst.debug_remove_log_function(custom_log_function) returns 0
#### Setup
- **Operating System:** Ubuntu 22.04
- **Device:** Computer
- **GStreamer Version:** 1.0
- **Command line:**
### Steps to reproduce the bug
1. register a custom log function with Gst.debug_add_log_function(custom_log_function, None)
2. remove it with Gst.debug_remove_log_function(custom_log_function)
### How reproducible is the bug?
Always
### Additional Information
I also tried removing it with user_data and it had the same effect.

## cuda context problems after gstreamer update to 1.22
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2444 · Arthur Khairullin · 2023-04-04

I have a project working with CUDA to run neural networks. I also use GStreamer pipelines with an nvh264dec block there. My working environment was a cuda11.4-based docker image with GStreamer 1.20.0 + plugins installed.
Now I'm trying to migrate to GStreamer 1.22.0, but the first of my detection modules fails with a CUDA_INVALID_CONTEXT error. The others start successfully. I see that the nvinfer1::ICudaEngine object of the first module returns nullptr from the createExecutionContext call.
As I said, with GStreamer 1.20.0 I never had any problem.
I noticed that command 'gst-launch-1.0 filesrc location=10.200.41.198.mp4 ! qtdemux ! h264parse ! nvh264dec ! queue ! "video/x-raw" ! fakesink'
returns 'Got context from element 'nvh264dec0': gst.cuda.context=context, gst.cuda.context=(GstCudaContext)"\(GstCudaContext\)\ cudacontext1", cuda-device-id=(uint)0;' on gstreamer 1.22.0 (context1 !)
while
'Got context from element 'nvh264dec0': gst.cuda.context=context, gst.cuda.context=(GstCudaContext)"\(GstCudaContext\)\ cudacontext0", cuda-device-id=(int)0;' on gstreamer 1.20.0 (context0 !)
I don't know if that is important or not - it's just an observation.
Could anyone help me with that problem? Any help will be appreciated. Thanks in advance.

## encodebin: Make usage of timestamper element configurable
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2446 · Thibault Saunier · 2023-04-04

The following discussion from !3779 should be addressed:
- [ ] @ndufresne started a [discussion](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3779#note_1737139): (+1 comment)
> I believe the timestamper adds latency regardless of whether timestamps are needed or not, making the timestamper unwanted for WebRTC use cases.

## gst_audio_converter_convert() has broken API
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2448 · Sebastian Dröge · 2023-04-06

* The XOR assertion makes no sense. What it probably wants to check is whether the flag is set and forbid that, but I think that's not required, as it only signals that the input can be used as temporary scrap space
* It will only work for interleaved in/out, as otherwise the in/out pointers would be an array of arrays with num-channel elements
We should fix the assertion, add an assertion for interleaved, deprecate the old function and add a new non-broken one.

Assignee: Mathieu Duponchelle

## playbin3 no longer uses contexts from the sink
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2449 · Michael Olbrich · 2024-02-09

With GStreamer 1.22.0, playbin3 no longer uses the contexts provided by the sink. It's easy to test on hardware with VA-API support:
```
GST_DEBUG=vaapisink:2 gst-launch-1.0 playbin3 uri=file://some/file/that/will/use/a/vaapi/decoder video-sink=vaapisink
```
This fails with an "Internal data stream error." and the vaapisink reports:
```
0:00:00.363445698 162906 0x7f29c402f9e0 WARN vaapisink gstvaapisink.c:1557:gst_vaapisink_show_frame_unlocked:<vaapisink0> incoming surface has different VAAPI Display
```
From what I can tell, this was broken in 6bffbe283ad6a662753fc8164fd8efd9d80d106e. Before that commit, playbin3 explicitly activated the sink and collected all provided contexts. Now this no longer happens and the decoder uses a different context than the sink.
@bilboed, any advice on how to fix this?Edward HerveyEdward Herveyhttps://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2450rtsp server stalls with v4l2h264enc on raspberry pi 42023-04-11T14:21:47ZAndressio Essiortsp server stalls with v4l2h264enc on raspberry pi 4Hi
I'm writing a rtsp application using gst-rtsp-server on raspberry pi 4 and Linux MATE 64bit SO.
Starting from _test-appsrc.cpp_ example I tried the following pipeline:
`appsrc name=mysrc ! videoconvert ! video/x-raw,format=I420 ! x264...Hi
I'm writing an RTSP application using gst-rtsp-server on a Raspberry Pi 4 with Linux MATE 64-bit OS.
Starting from _test-appsrc.cpp_ example I tried the following pipeline:
`appsrc name=mysrc ! videoconvert ! video/x-raw,format=I420 ! x264enc ! rtph264pay name=pay0 pt=96`
and it works as expected: a VLC client is able to open and play the rtsp stream.
In the next step I tried to replace the software H.264 encoder with the `v4l2h264enc` element, but it failed. The stream can be opened from an RTSP client, but after about a second the server stalls and the stream stops.
With the same pipeline, if I replace the `appsrc` with a `videotestsrc is-live=true`, the stream also works well with `v4l2h264enc`.
I attach my code:
```cpp
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>
#include <iostream>
typedef struct
{
gboolean white;
GstClockTime timestamp;
} MyContext;
/* called when we need to give data to appsrc */
static void
need_data (GstElement * appsrc, guint unused, MyContext * ctx)
{
GstBuffer *buffer;
guint size;
GstFlowReturn ret;
size = 640 * 480 * 3;
buffer = gst_buffer_new_allocate (NULL, size, NULL);
/* this makes the image black/white */
gst_buffer_memset (buffer, 0, ctx->white ? 0xff : 0x0, size);
std::cout << "Entro: " << ctx->white << std::endl;
ctx->white = !ctx->white;
/* increment the timestamp every 1/30 second */
GST_BUFFER_PTS (buffer) = ctx->timestamp;
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 30);
ctx->timestamp += GST_BUFFER_DURATION (buffer);
g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
gst_buffer_unref (buffer);
}
/* called when a new media pipeline is constructed. We can query the
* pipeline and configure our appsrc */
static void
media_configure (GstRTSPMediaFactory * factory, GstRTSPMedia * media,
gpointer user_data)
{
GstElement *element, *appsrc;
MyContext *ctx;
/* get the element used for providing the streams of the media */
element = gst_rtsp_media_get_element (media);
/* get our appsrc, we named it 'mysrc' with the name property */
appsrc = gst_bin_get_by_name_recurse_up (GST_BIN (element), "mysrc");
/* this instructs appsrc that we will be dealing with timed buffer */
gst_util_set_object_arg (G_OBJECT (appsrc), "format", "time");
ctx = g_new0 (MyContext, 1);
ctx->white = FALSE;
ctx->timestamp = 0;
/* make sure the data is freed when the media is gone */
g_object_set_data_full (G_OBJECT (media), "my-extra-data", ctx,
(GDestroyNotify) g_free);
/* install the callback that will be called when a buffer is needed */
g_signal_connect (appsrc, "need-data", (GCallback) need_data, ctx);
gst_object_unref (appsrc);
gst_object_unref (element);
}
int
main (int argc, char *argv[])
{
GMainLoop *loop;
GstRTSPServer *server;
GstRTSPMountPoints *mounts;
GstRTSPMediaFactory *factory;
gst_init (&argc, &argv);
loop = g_main_loop_new (NULL, FALSE);
/* create a server instance */
server = gst_rtsp_server_new ();
/* get the mount points for this server, every server has a default object
 * that can be used to map uri mount points to media factories */
mounts = gst_rtsp_server_get_mount_points (server);
/* make a media factory for a test stream. The default media factory can use
* gst-launch syntax to create pipelines.
* any launch line works as long as it contains elements named pay%d. Each
* element with pay%d names will be a stream */
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory,
"( appsrc is-live=true name=mysrc ! video/x-raw, format=(string)RGB, width=(int)640, height=(int)480, framerate=(fraction)30/1 ! videoconvert ! v4l2h264enc ! video/x-h264, level=(string)4 ! rtph264pay name=pay0 pt=96 )");
/* notify when our media is ready, This is called whenever someone asks for
* the media and a new pipeline with our appsrc is created */
g_signal_connect (factory, "media-configure", (GCallback) media_configure,
NULL);
/* attach the test factory to the /test url */
gst_rtsp_mount_points_add_factory (mounts, "/test", factory);
/* don't need the ref to the mounts anymore */
g_object_unref (mounts);
/* attach the server to the default maincontext */
gst_rtsp_server_attach (server, NULL);
/* start serving */
g_print ("stream ready at rtsp://127.0.0.1:8554/test\n");
g_main_loop_run (loop);
return 0;
}
```

## Audiomixer does not forward tags
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2451 · Jack · 2023-04-06

Hello,
With Python, I have two pipelines:
1) uridecodebin ! audioconvert ! audioresample ! volume ! tee ! queue ! lamemp3enc ! shout2send
2) uridecodebin ! audioconvert ! audioresample ! volume ! audiomixer ! tee ! queue ! lamemp3enc ! shout2send
With the first pipeline, I can get tags on the Icecast server; this is not the case with the second pipeline.
Is this a problem with audiomixer not forwarding tags?
My Setup:
Ubuntu 22.04
GStreamer 1.20.3
Python 3.10.6
++
Jack

## decodebin3: Check compatibility of candidate decoder with caps
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2452 · Qi Hou · 2023-12-15

There is a case where the audio and video decoders are both created successfully, and then the multiqueue in decodebin3 tries to push the first audio buffer to the audio decoder avdec_aac.
But the audio decoder's sink_event() returns 0 (meaning failure) when handling the sticky event GST_EVENT_CAPS, which makes the multiqueue's buffer push return GST_FLOW_NOT_NEGOTIATED. The whole pipeline pauses and exits.
The expected result is to remove the audio track and carry on decoding the video track.
Playbin2 obtains the expected result. This is because playbin2 has a function named send_sticky_events() to specially handle events. If it gets the same GST_FLOW_NOT_NEGOTIATED as playbin3, playbin2 just removes the audio-related elements and lets the video carry on decoding.
Is there any plan to fix this in decodebin3/playbin3?

## directshow plugin unable to compile on VS due to library linking errors
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2455 · Elliott Velasco · 2023-04-09

Hello, I am wondering what the necessary libraries to link against are when building the directshow plugin? I checked the meson.build file in the plugin directory and I do have all the libraries listed under dshow_deps[].
This is an example of the errors I'm getting:

```
Error LNK2001 unresolved external symbol "public: virtual long __cdecl CBaseFilter::JoinFilterGraph"
```

However, when looking around, this is part of strmbase.lib, which I already have linked.

## Delay with kicking on ONVIF backchannel
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2457 · Jordan Bradshaw · 2023-04-07

I've been testing with around 10 different VMS companies. My experience is that when doing RTP/RTSP/TCP, the talkback requires around 2 seconds before the connection is active for data to get pushed through. Are there any optimizations on the RTSP server that I can do that will improve performance in this regard? Sorry if this isn't the place for this, but I don't know where else to ask about the ONVIF-specific parts of GStreamer.

## qmlglsink: Feature request: implement QAbstractVideoFilter-like feature in GstGLVideoItem for a convenient way of getting QImages from the samples
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2458 · Vincas Dargis · 2023-04-07

In Qt Multimedia there's a way to install a filter (https://doc.qt.io/qt-5/qabstractvideofilter.html) into a Qml `VideoOutput` which allows acquiring frames as `QImage`s and then performing some additional processing, like passing that image to OpenCV algorithms.
Currently I presume one would have to modify the pipeline to use `tee` and `appsink` and get frames via that, or poll the `last-sample` property using a timer after determining the frame rate of the pipeline.
Having an optional mode where `GstGLVideoItem` would emit a `QImage` for every sample, or introducing an analogous "filter" runnable that would not block the pipeline, would make this much easier, without the need to modify the pipeline or do other gymnastics.