GStreamer issues
https://gitlab.freedesktop.org/groups/gstreamer/-/issues
2021-09-13T05:30:02Z

https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1558
rtmp2sink: authentication is broken
Arun Raghavan, 2021-09-13

With the addition of MR !1862, authentication with `rtmp2sink` is broken. The reason is that the authentication process can involve the creation of more than one `GstRtmpConnection`: the first connection results in a 403, the second fails with a challenge, and finally a third connection succeeds.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/146
rav1e: Update to latest encoder / configuration API
Sebastian Dröge, 2022-09-16

And update properties accordingly.
There's also a channels-based API that might fit the usage pattern here better.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/145
rav1e: Negotiates to Y444_12LE by default due to caps ordering
Sebastian Dröge, 2021-03-25

And that's of course slow. Not sure how best to solve that.

https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/issues/320
RTSPSRC TCP EOS issue
Mike Haines, 2021-03-25

I am connected to an IP camera live.
When the protocols property is set to TCP, every 8 seconds I receive an EOS signal. When it is not set to TCP, the signal is not sent.
```
let src = gst::ElementFactory::make("rtspsrc", None)
    .map_err(|_| MissingElement("rtspsrc"))
    .unwrap();
src.set_property("location", &uri)
    .expect("setting rtspsrc location property failed");
src.set_property("latency", &3000u32)
    .expect("setting rtspsrc latency property failed");
src.set_property("timeout", &8000000u64)
    .expect("setting rtspsrc timeout property failed");
src.set_property_from_str("protocols", "tcp");
```

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/677
systemclock: Deadlock caused by possible regression
Pieter Jordaan, 2021-04-02

Hi
I discovered a deadlock scenario in systemclock introduced by @bilboed commit https://gitlab.freedesktop.org/gstreamer/gstreamer/-/commit/17feeb1bd6f4fa31d050054a6f3adb04816228d9
The code in [line 972](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/commit/17feeb1bd6f4fa31d050054a6f3adb04816228d9#e85b10a2fbdeee3b935011d3dd8a11f07b352c8b_969_972) makes the assumption that the end time in the absolute sleep is from the MONOTONIC clock. When the pipeline is set to use REALTIME (or something else), however, the sleep never wakes (in this lifetime), causing sync items to deadlock.
In my test case
```
{tv_sec = 1616671370, tv_nsec = 913332245}
```
was the end time waited on, but my monotonic clock (minutes later) was at `tv_sec = 617953`.
I see the following possible fixes/workarounds:
1. Don't set the pipeline to REALTIME and manually do the realtime calculations on my end
2. Use the same CLOCK type to sleep for absolute time
3. Calculate the proper MONOTONIC end time, instead of the pipeline clock's end time
4. Skip the nanosleep optimisation when not on the MONOTONIC clock
Here is a small example I used to test my theory:
```
#include <ctime>
#include <iostream>

int main() {
    timespec startreal, startmonotonic;
    clock_gettime(CLOCK_REALTIME, &startreal);
    clock_gettime(CLOCK_MONOTONIC, &startmonotonic);
    std::cout << "MONO: " << startmonotonic.tv_sec << " SYS: " << startreal.tv_sec << std::endl;

    // Absolute deadline taken from the REALTIME clock...
    timespec end = startreal;
    end.tv_sec += 5;
    // ...but slept on with the MONOTONIC clock, as systemclock does.
    clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &end, NULL);

    clock_gettime(CLOCK_REALTIME, &startreal);
    clock_gettime(CLOCK_MONOTONIC, &startmonotonic);
    std::cout << "AFTERMONO: " << startmonotonic.tv_sec << " SYS: " << startreal.tv_sec << std::endl;
    return 0;
}
```
Obviously AFTERMONO is not printed after 5s...
I'm happy to make a merge request once you recommend the correct approach to follow, from my suggestions or yours.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/144
rav1e: Enable asm feature and integrate into meson
Sebastian Dröge, 2021-03-25

CC @alatiera
This probably needs to become a cargo feature proxied by the plugin to the `rav1e` crate. `meson.build` should enable it only if it exists.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/676
Failed to use uridecodebin to process hls (.m3u8) stream
chauncey wang, 2021-09-24

I am trying the audio classification sample from deepstream-audio.
It works well with WAV as input, but after changing the input to an HLS video it fails to give a prediction.
From my understanding, uridecodebin will automatically output the stream in the format required by the following module.
In my case, uridecodebin should demux, parse AAC, convert and resample.
The final output of uridecodebin should be raw mono audio with a rate of 44100 and format of S16LE.
The dumped dot graph looks as I expect, while the log shows errors.
Any suggestion for debugging would be appreciated :-)
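For what it's worth, a stripped-down pipeline along these lines (a sketch; `$HLS_URI` is a placeholder for the real playlist URL) should let you check the decode/convert/resample path in isolation, outside of deepstream:

```shell
# Hypothetical check: force uridecodebin output through audioconvert /
# audioresample into the target format; $HLS_URI stands in for the playlist.
gst-launch-1.0 -v uridecodebin uri="$HLS_URI" ! audioconvert ! audioresample \
    ! 'audio/x-raw,format=S16LE,rate=44100,channels=1' ! fakesink
```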
```
0:00:02.019216225 11118 0x55bda40810a0 WARN GST_PADS gstpad.c:4226:gst_pad_peer_query:<src_elem:src_0> could not send sticky events
0:00:02.019310877 11118 0x55bda40810a0 WARN v4l2videodec gstv4l2videodec.c:1673:gst_v4l2_video_dec_decide_allocation:<nvv4l2decoder0> Duration invalid, not setting latency
0:00:02.019334836 11118 0x55bda40810a0 WARN v4l2bufferpool gstv4l2bufferpool.c:1066:gst_v4l2_buffer_pool_start:<nvv4l2decoder0:pool:src> Uncertain or not enough buffers, enabling copy threshold
** INFO: <bus_callback:123>: Pipeline running
0:00:02.020388536 11118 0x55bda4035c50 WARN audio-resampler audio-resampler.c:274:convert_taps_gint16_c: can't find exact taps
max_fps_dur 8.33333e+06 min_fps_dur 2e+08
0:00:02.020579416 11118 0x55bda4035c50 WARN nvstreammux gstnvstreammux.cpp:742:configure_module:<src_bin_muxer> No config-file provided; falling back to default streammux config 1
0:00:02.020597766 11118 0x55bda40816d0 WARN v4l2bufferpool gstv4l2bufferpool.c:1513:gst_v4l2_buffer_pool_dqbuf:<nvv4l2decoder0:pool:src> Driver should never set v4l2_buffer.field to ANY
**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
0:00:08.530273525 11118 0x55bda4038630 WARN mpegts gstmpegtssection.c:161:__common_section_checks: PID:0x0000 table_id:0x00, Bad CRC on section
**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
```

https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1557
geometrictransform: perspective
Luke K, 2021-09-24

I seem to be unable to set the "matrix" array property using gst-launch. It expects a GValueArray of 9 gdoubles.
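For context, this is the kind of launch line I would expect to work; the `<...>` syntax is gst-launch's array notation, and the nine values here are just an illustrative identity matrix:

```shell
# Hypothetical attempt: gst-launch's `<...>` array syntax only maps to
# GstValueArray properties, so it cannot fill a GValueArray like "matrix".
gst-launch-1.0 videotestsrc ! perspective \
    matrix='<1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0>' \
    ! videoconvert ! autovideosink
```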
I am told it should be changed to a GstValueArray to work from gst-launch.

https://gitlab.freedesktop.org/gstreamer/gst-docs/-/issues/85
Link text is missing
Clyde McQueen, 2023-05-24

If this is the wrong spot to record this issue, please redirect.
I've noticed that a number of gstreamer plugin documentation pages are missing link text. E.g., in:
https://gstreamer.freedesktop.org/documentation/vaapi/vaapih264enc.html?gi-language=c
The 2nd paragraph of the body reads (I've marked missing text with **MISSING**):
The **MISSING** property controls the type of encoding. In case of Constant Bitrate Encoding (CBR), the **MISSING** will determine the quality of the encoding. Alternatively, one may choose to perform Constant Quantizer or Variable Bitrate Encoding (VBR), in which case the **MISSING** is the maximum bitrate.
The html delivered to my browser is:
~~~
<p>The <a href="GstVaapiEncodeH264:rate-control"></a> property controls the type of
encoding. In case of Constant Bitrate Encoding (CBR), the
<a href="GstVaapiEncodeH264:bitrate"></a> will determine the quality of the
encoding. Alternatively, one may choose to perform Constant
Quantizer or Variable Bitrate Encoding (VBR), in which case the
<a href="GstVaapiEncodeH264:bitrate"></a> is the maximum bitrate.</p>
~~~
This same problem appears on a number of pages. My workaround is to examine the HTML.
Thanks.

Assignee: Mathieu Duponchelle

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/675
Changing last element in a pipeline
Maxim Maxim, 2022-11-10

[GStreamer Doc](https://gstreamer.freedesktop.org/documentation/application-development/advanced/pipeline-manipulation.html#changing-elements-in-a-pipeline)
This tutorial looks at the case of changing the middle element in the pipeline. But I cannot change the end element in the pipeline. Help me please.
```
#include <iostream>
#include <queue>
#include <string>
#include <vector>

#include <boost/format.hpp>
#include <gst/gst.h>

#include "GStreamer/Utils.h"

GstPad* blockpad_1;
GstPad* blockpad_2;
GstElement* before_1;
GstElement* after_1;
GstElement* before_2;
GstElement* cur_effect;
GstElement* pipeline;
GstElement* sink_1;
GstElement* sink_2;
GstElement* q1;
GstElement* q2;
GstElement* q3;
std::queue<GstElement*> effects;

using namespace mongoose;

GstPadProbeReturn event_probe_cb_1(GstPad* pad, GstPadProbeInfo* info, gpointer user_data) {
    GMainLoop* loop = static_cast<GMainLoop*>(user_data);
    GstElement* before = before_1;
    GstElement* current = cur_effect;
    GstElement* next = effects.front();
    GstElement* after = after_1;
    if (GST_EVENT_TYPE(GST_PAD_PROBE_INFO_DATA(info)) != GST_EVENT_EOS) {
        return GST_PAD_PROBE_PASS;
    }
    gst_pad_remove_probe(pad, GST_PAD_PROBE_INFO_ID(info));
    gst_object_ref(current);
    effects.push(current);
    effects.pop();
    if (next == nullptr) {
        g_main_loop_quit(loop);
        return GST_PAD_PROBE_DROP;
    }
    auto msg = boost::format("Switching from '%s' to '%s'")
        % GST_OBJECT_NAME(current)
        % GST_OBJECT_NAME(next);
    std::cout << msg << std::endl;
    gst_element_set_state(current, GST_STATE_NULL);
    gst_bin_remove(GST_BIN(pipeline), current);
    gst_bin_add(GST_BIN(pipeline), next);
    gst_element_link_many(before, next, after, NULL);
    gst_element_set_state(next, GST_STATE_PLAYING);
    cur_effect = next;
    return GST_PAD_PROBE_DROP;
}

GstPadProbeReturn event_probe_cb_2(GstPad* pad, GstPadProbeInfo* info, gpointer user_data) {
    GMainLoop* loop = static_cast<GMainLoop*>(user_data);
    GstElement* before = before_2;
    GstElement* current = sink_1;
    GstElement* next = sink_2;
    if (GST_EVENT_TYPE(GST_PAD_PROBE_INFO_DATA(info)) != GST_EVENT_EOS) {
        return GST_PAD_PROBE_PASS;
    }
    gst_pad_remove_probe(pad, GST_PAD_PROBE_INFO_ID(info));
    gst_object_ref(current);
    if (next == nullptr) {
        g_main_loop_quit(loop);
        return GST_PAD_PROBE_DROP;
    }
    auto msg = boost::format("Switching from '%s' to '%s'")
        % GST_OBJECT_NAME(current)
        % GST_OBJECT_NAME(next);
    std::cout << msg << std::endl;
    gst_element_set_state(current, GST_STATE_NULL);
    gst_bin_remove(GST_BIN(pipeline), current);
    gst_bin_add(GST_BIN(pipeline), next);
    gst_element_link_many(before, next, NULL);
    gst_element_set_state(next, GST_STATE_PLAYING);
    return GST_PAD_PROBE_DROP;
}

GstPadProbeReturn pad_probe_cb_1(GstPad* pad, GstPadProbeInfo* info, gpointer user_data) {
    gst_pad_remove_probe(pad, GST_PAD_PROBE_INFO_ID(info));
    gst::Pad srcpad(gst_element_get_static_pad(cur_effect, "src"));
    gst_pad_add_probe(
        srcpad.get(),
        GstPadProbeType(GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM),
        event_probe_cb_1,
        user_data,
        NULL);
    gst::Pad sinkpad(gst_element_get_static_pad(cur_effect, "sink"));
    gst_pad_send_event(sinkpad.get(), gst_event_new_eos());
    return GST_PAD_PROBE_OK;
}

GstPadProbeReturn pad_probe_cb_2(GstPad* pad, GstPadProbeInfo* info, gpointer user_data) {
    gst_pad_remove_probe(pad, GST_PAD_PROBE_INFO_ID(info));
    gst::Pad srcpad(gst_element_get_static_pad(sink_1, "sink"));
    gst_pad_add_probe(
        srcpad.get(),
        GstPadProbeType(GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM),
        event_probe_cb_2,
        user_data,
        NULL);
    gst::Pad sinkpad(gst_element_get_static_pad(q3, "sink"));
    gst_pad_send_event(sinkpad.get(), gst_event_new_eos());
    return GST_PAD_PROBE_OK;
}

gboolean timeout_cb_1(gpointer user_data) {
    gst_pad_add_probe(
        blockpad_1,
        GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
        pad_probe_cb_1,
        user_data,
        NULL);
    return TRUE;
}

gboolean timeout_cb_2(gpointer user_data) {
    gst_pad_add_probe(
        blockpad_2,
        GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
        pad_probe_cb_2,
        user_data,
        NULL);
    return FALSE;
}

gboolean bus_cb(GstBus* /*bus*/, GstMessage* msg, gpointer user_data) {
    GMainLoop* loop = static_cast<GMainLoop*>(user_data);
    switch (GST_MESSAGE_TYPE(msg)) {
        case GST_MESSAGE_ERROR: {
            GError* err = NULL;
            gchar* dbg;
            gst_message_parse_error(msg, &err, &dbg);
            gst_object_default_error(msg->src, err, dbg);
            g_clear_error(&err);
            g_free(dbg);
            g_main_loop_quit(loop);
            break;
        }
        default:
            break;
    }
    return TRUE;
}

void work() {
    std::vector<char const*> effect_names {
        "identity",
        "exclusion",
        "navigationtest",
        "agingtv",
        "videoflip",
        "vertigotv",
        "gaussianblur",
        "shagadelictv",
        "edgetv"
    };
    gst_init(nullptr, nullptr);
    for (auto const& e : effect_names) {
        auto element = gst::create_element(e);
        std::cout << "Effect: " << e << std::endl;
        effects.push(element);
    }
    pipeline = gst::pipeline_new();
    GstElement* src = gst::create_element("videotestsrc");
    g_object_set(src, "is-live", TRUE, NULL);
    GstElement* filter1 = gst::create_element("capsfilter");
    gst_util_set_object_arg(
        G_OBJECT(filter1),
        "caps",
        "video/x-raw, width=320, height=240, "
        "format={ I420, YV12, YUY2, UYVY, AYUV, Y41B, Y42B, "
        "YVYU, Y444, v210, v216, NV12, NV21, UYVP, A420, YUV9, YVU9, IYU1 }");
    q1 = gst::create_element("queue");
    q2 = gst::create_element("queue");
    q3 = gst::create_element("queue");
    blockpad_1 = gst_element_get_static_pad(q1, "src");
    blockpad_2 = gst_element_get_static_pad(q3, "src");
    before_1 = gst::create_element("videoconvert");
    GstElement* effect = effects.front();
    effects.pop();
    cur_effect = effect;
    after_1 = gst::create_element("videoconvert");
    GstElement* filter2 = gst::create_element("capsfilter");
    gst_util_set_object_arg(
        G_OBJECT(filter2),
        "caps",
        "video/x-raw, width=320, height=240, "
        "format={ RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR }");
    GstElement* encoder = gst::create_element("x264enc");
    GstElement* muxer = gst::create_element("mpegtsmux");
    before_2 = q3;
    //sink_1 = gst::create_element("autovideosink");
    sink_1 = gst::create_element("filesink");
    g_object_set(G_OBJECT(sink_1), "location", "/data/output_3.mp4", NULL);
    sink_2 = gst::create_element("filesink");
    g_object_set(G_OBJECT(sink_2), "location", "/data/output_4.mp4", NULL);
    gst_bin_add_many(
        GST_BIN(pipeline),
        src, filter1, q1, before_1, effect, after_1, q2, encoder, muxer, q3, sink_1,
        NULL);
    gst_element_link_many(
        src, filter1, q1, before_1, effect, after_1, q2, encoder, muxer, q3, sink_1,
        NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    GMainLoop* loop = g_main_loop_new(NULL, FALSE);
    gst_bus_add_watch(GST_ELEMENT_BUS(pipeline), bus_cb, loop);
    //g_timeout_add_seconds(1, timeout_cb_1, loop);
    g_timeout_add_seconds(3, timeout_cb_2, loop);
    g_main_loop_run(loop);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
}

int main() {
    try {
        work();
    } catch (std::exception const& e) {
        std::cerr << e.what() << std::endl;
    } catch (...) {
        std::cerr << "Undefined exception!" << std::endl;
    }
    return 0;
}
```

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/674
Streaming the video from SPI source
許連逢, 2022-11-10

Hello,
I am attempting to get the video bitstream from SPI and I found that there is no solution for SPI source currently.
Is there any way to achieve this, or do I need to develop my own plugin by following the tutorial?
Thank you.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/143
Need help for custom S3 endpoint
vuonghoainam81096, 2021-03-24

Hi there! Currently we want to use gst-plugin-s3 with our [MinIO](https://min.io) deployment.
However, I cannot find a way to specify the endpoint with gst-plugin-s3, as there is no endpoint_url environment variable option in Boto3.
Please help us with a way. Thank you very much.

https://gitlab.freedesktop.org/gstreamer/gstreamer-sharp/-/issues/56
Playback tutorial 4 error - stream is in the wrong format
Tomislav Tustonic, 2021-09-24

I'm trying to run Playback tutorial 4 with the 1.18.4 versions.
When I try to set the download flag
(https://gitlab.freedesktop.org/gstreamer/gstreamer-sharp/-/blob/master/samples/PlaybackTutorial4.cs#L42)
I get the following errors:
```
Error: The stream is in the wrong format.
../gst-libs/gst/audio/gstaudiobasesink.c(1117): gst_audio_base_sink_wait_event (): /GstPlayBin:playbin0/GstPlaySink:playsink/GstBin:abin/GstWasapiSink:wasapisink0:
Sink not negotiated before eos event.
```
There's no error if I use a local URL, or some larger file, like:
https://download.blender.org/durian/trailer/sintel_trailer-1080p.ogv
https://dash.akamaized.net/akamai/bbb_30fps/bbb_30fps_1920x1080_8000k.mpd

https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/issues/319
Comparisons to ClockTime::none() behave strangely
Nirbheek Chauhan <nirbheek.chauhan@gmail.com>, 2021-05-19

This was partially addressed in https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/607, but it's still an issue. Specifically, if you do:
```rust
// duration might be gst::ClockTime::none()
let duration = pipeline.query_duration::<gst::format::Time>().unwrap();
...
// This will always evaluate to true if duration is none()!
if start > duration {
error!("Invalid start");
}
```
I would either expect:
1) To be forced to check that duration is not `none()` before being allowed to do a comparison, or
2) To have `query_duration()` return `UINT64_MAX` or `inf` when the duration is unknown so that the comparison works
I suppose https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/issues/234 proposes (1).
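To illustrate option (1), here's a standalone sketch (modelling `ClockTime` as a plain `Option<u64>` of nanoseconds, with `None` playing the role of `ClockTime::none()`); the caller is forced to decide what an unknown duration means before any comparison happens:

```rust
// Model only: `None` stands in for gst::ClockTime::none().
fn start_exceeds_duration(start: u64, duration: Option<u64>) -> bool {
    match duration {
        // Unknown duration: we cannot claim the start is out of range.
        None => false,
        Some(d) => start > d,
    }
}
```

With today's behaviour, `start > duration` silently evaluates to true whenever `duration` is `none()`, instead of forcing this choice.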
CC: @fengalin

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2758
mpegtsmux: Stream packets can be sent prior to first PAT/PMT
Andrew Gall, 2023-07-04

Since 1.18, by default the first stream created (based on the pad creation order) is considered the PCR stream. Yet this PCR stream may not be the first stream whose buffers are muxed, which results in writing stream data prior to writing the PAT/PMT.
In 1.16, despite the logic introduced by https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/commit/3f0463c43e2c61ba5509c9466dc2a023dc866ad4, it isn't until a few lines below that the stream is initially [created](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/1.16/gst/mpegtsmux/mpegtsmux.c#L979). So despite logging that a PCR stream had been selected and calling `tsmux_program_set_pcr_stream` it did not have any effect. Afterwards the first stream used for output is selected as the PCR stream [here](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/1.16/gst/mpegtsmux/mpegtsmux.c#L1435).
So it appears quite intentional to choose the first program stream as the PCR stream. However, at least for my use case, it is common not to know if the first buffer will be audio or video until after linking the pipeline. In this case the behavior of 1.16 where it selects the first stream buffer for output as the PCR stream is needed.
I do hesitate to create an MR to remove that logic as I only have a limited familiarity with the code. Perhaps it would be better instead to ensure that the SI structures are written before the first stream data?
Reproduction:
`gst-launch-1.0 audiotestsrc num-buffers=1 timestamp-offset=1 ! avenc_aac ! aacparse ! .sink_101 mpegtsmux name=mux ! tsdemux ! fakesink videotestsrc num-buffers=1 ! x264enc ! mux.sink_102`
In this case audio is selected as the PCR stream as it is linked first. Yet due to audio's `timestamp-offset=1` the first video buffer is sent before the PAT/PMT. If you enable logging in the demuxer there will be a bunch of "PID 0x0066 Saw packet on a pid we don't handle" messages.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1556
mpegtsmux: Stream packets can be sent prior to first PAT/PMT
Andrew Gall, 2023-07-04

Since 1.18, by default the first stream created (based on the pad creation order) is considered the PCR stream. Yet this PCR stream may not be the first stream whose buffers are muxed, which results in writing stream data prior to writing the PAT/PMT.
In 1.16, despite the logic introduced by https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/commit/3f0463c43e2c61ba5509c9466dc2a023dc866ad4, it isn't until a few lines below that the stream is initially [created](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/1.16/gst/mpegtsmux/mpegtsmux.c#L979). So despite logging that a PCR stream had been selected and calling `tsmux_program_set_pcr_stream` it did not have any effect. Afterwards the first stream used for output is selected as the PCR stream [here](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/1.16/gst/mpegtsmux/mpegtsmux.c#L1435).
So it appears quite intentional to choose the first program stream as the PCR stream. However, at least for my use case, it is common not to know if the first buffer will be audio or video until after linking the pipeline. In this case the behavior of 1.16 where it selects the first stream buffer for output as the PCR stream is needed.
I do hesitate to create an MR to remove that logic as I only have a limited familiarity with the code. Perhaps it would be better instead to ensure that the SI structures are written before the first stream data?
Reproduction:
`gst-launch-1.0 audiotestsrc num-buffers=1 timestamp-offset=1 ! avenc_aac ! aacparse ! .sink_101 mpegtsmux name=mux ! tsdemux ! fakesink videotestsrc num-buffers=1 ! x264enc ! mux.sink_102`
In this case audio is selected as the PCR stream as it is linked first. Yet due to audio's `timestamp-offset=1` the first video buffer is sent before the PAT/PMT. If you enable logging in the demuxer there will be a bunch of "PID 0x0066 Saw packet on a pid we don't handle" messages.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/883
gstbasesrc: get_size() -> FALSE still causes duration queries to be answered
Alicia Boya García, 2021-09-24

The default handler of duration queries in gstbasesrc returns TRUE even if the subclass didn't set a size, returning -1 as the duration. This is the case even when the duration format doesn't match, e.g. a TIME duration query is made, but the format of the basesrc is BYTES.
Examples of this happening:
* `filesrc location=<(cat anything)`
* `souphttpsrc` when the resource doesn't set a Content-Length
This breaks duration queries for qtdemux in cases like the above.
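A sketch of the behaviour being asked for (hypothetical types and names, not the actual gstbasesrc code): the default handler would answer the query only when it really has a matching-format size, and otherwise fail it so that downstream elements can fall back to their own metadata.

```c
#include <stdbool.h>
#include <stdint.h>

typedef enum { FORMAT_BYTES, FORMAT_TIME } Format; /* stand-in for GstFormat */

/* Hypothetical default duration-query handler: refuse to answer (return
 * false) when no size was set or the requested format doesn't match,
 * instead of "succeeding" with -1. */
static bool handle_duration_query(Format src_format, bool have_size,
                                  int64_t size, Format query_format,
                                  int64_t* duration) {
    if (!have_size || query_format != src_format)
        return false; /* unanswered; downstream can use its own metadata */
    *duration = size;
    return true;
}
```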
When receiving a duration query, qtdemux first forwards it upstream, and if it receives a success, it returns that value as the result. Only when that fails will it look at the duration encoded in the MP4 file and return that.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/882
Deadlock stopping alsa audiosink
Doug Nazar, 2021-04-08

I've occasionally had my app deadlock. The main thread is waiting for the alsasink thread to finish, but the alsasink thread is stuck in the loop at https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/ext/alsa/gstalsasink.c#L1069 and is not making any progress, since alsa is paused and `snd_pcm_writei()` returns 0 if it's in `SND_PCM_STATE_PAUSED`.
I've been staring at the logic and I can't find anything that synchronizes between the sink thread and pausing. If the sink thread passes the `gst_audio_ring_buffer_prepare_read()` check in https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/blob/master/gst-libs/gst/audio/gstaudiosink.c#L236, but another thread calls `gst_alsasink_pause()` before `snd_pcm_writei()` it can get into an infinite loop. The locking that happens at the ring buffer level doesn't impact the locking at the gstalsasink.
I've debugged this with 1.18.3 but a quick scan of 1.18.4 & HEAD doesn't show anything that I think will impact this.
I'm currently running with the following patch, which doesn't trigger during normal usage, but does allow it to recover. I'm not sure if it's the correct way to solve this issue. Perhaps the audioring should ensure it's out of the write loop before calling the next layer on pause.
```
diff -ru gst-plugins-base-1.18.3/ext/alsa/gstalsasink.c /var/tmp/portage/media-libs/gst-plugins-base-1.18.3/work/gst-plugins-base-1.18.3/ext/alsa/gstalsasink.c
--- gst-plugins-base-1.18.3/ext/alsa/gstalsasink.c 2021-01-13 16:07:13.997253000 -0500
+++ /var/tmp/portage/media-libs/gst-plugins-base-1.18.3/work/gst-plugins-base-1.18.3/ext/alsa/gstalsasink.c 2021-03-13 18:11:26.729060020 -0500
@@ -1068,6 +1068,14 @@
goto write_error;
}
continue;
+ } else if (err == 0 && alsa->hw_support_pause) {
+ /* We might be paused, if so, just bail */
+ snd_pcm_state_t state;
+
+ state = snd_pcm_state (alsa->handle);
+ GST_WARNING_OBJECT (asink, "TMPDBG: pause check. state = %i", state);
+ if (state == SND_PCM_STATE_PAUSED)
+ break;
}
ptr += snd_pcm_frames_to_bytes (alsa->handle, err);
diff -ru gst-plugins-base-1.18.3/gst-libs/gst/audio/gstaudiosink.c /var/tmp/portage/media-libs/gst-plugins-base-1.18.3/work/gst-plugins-base-1.18.3/gst-libs/gst/audio/gstaudiosink.c
--- gst-plugins-base-1.18.3/gst-libs/gst/audio/gstaudiosink.c 2021-01-13 16:07:14.021253000 -0500
+++ /var/tmp/portage/media-libs/gst-plugins-base-1.18.3/work/gst-plugins-base-1.18.3/gst-libs/gst/audio/gstaudiosink.c 2021-03-15 15:03:21.496837689 -0400
@@ -254,7 +254,12 @@
GST_DEBUG_FUNCPTR_NAME (writefunc),
(errno > 1 ? g_strerror (errno) : "unknown"), left, written);
break;
+ } else if (written == 0 && G_UNLIKELY (g_atomic_int_get (&buf->state) !=
+ GST_AUDIO_RING_BUFFER_STATE_STARTED)) {
+ GST_WARNING_OBJECT (sink, "TMPDBG: started check.");
+ break;
}
+
left -= written;
readptr += written;
} while (left > 0);
```

https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1555
wasapisink: PLAYING -> READY -> PLAYING state change fails
Jakub Janků, 2021-07-12

When the sink goes from PLAYING to READY, it cannot return to PLAYING afterwards. The following error is printed:
`wasapi gstwasapiutil.c:901:gst_wasapi_util_initialize_audioclient:<wasapisink0> IAudioClient::Initialize failed (88890002): AUDCLNT_E_ALREADY_INITIALIZED`
I'm attaching a very simple program that I used for testing, in case it's of any use.
[wasapi-test.c](/uploads/824116652455dc0104c13a43e0ce7f80/wasapi-test.c)

https://gitlab.freedesktop.org/gstreamer/cerbero/-/issues/320
1.18.4: MacOS prebuilt binaries contain duplicate copies of libraries after install
w3sip, 2024-02-06

For example:
```
$ ls -la /Library/Frameworks/GStreamer.framework/Versions/1.0/lib/libxml*
-rwxr-xr-x 1 root wheel 1647816 Mar 15 22:18 /Library/Frameworks/GStreamer.framework/Versions/1.0/lib/libxml2.2.dylib
-rw-r--r-- 1 root wheel 7300720 Mar 15 22:18 /Library/Frameworks/GStreamer.framework/Versions/1.0/lib/libxml2.a
-rwxr-xr-x 1 root wheel 1647816 Mar 15 22:18 /Library/Frameworks/GStreamer.framework/Versions/1.0/lib/libxml2.dylib
-rwxr-xr-x 1 root wheel 1098 Mar 15 22:18 /Library/Frameworks/GStreamer.framework/Versions/1.0/lib/libxml2.la
```
Both libxml2.2.dylib and libxml2.dylib are hard copies of the library rather than symlinks. This inflates the macOS framework size quite a bit.