GStreamer issues (https://gitlab.freedesktop.org/groups/gstreamer/-/issues)

# rtpmanager/rtpsession: race conditions leading to critical warnings
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2536 (2023-05-09, François Laignel)

## Conditions
While testing the [implementation for insertable streams](https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1176) in `webrtcsink` &
`webrtcsrc`, I encountered multiple critical warnings, which turned out to
result from two race conditions in `rtpsession`. Both race conditions produce:
> GLib-CRITICAL: g_hash_table_foreach:
> assertion 'version == hash_table->version' failed
In its simplest form, the test consists of two pipelines and a signalling server:

* `pipelines_sink`: `audiotestsrc ! webrtcsink`
* `pipelines_src`: `webrtcsrc ! appsink`

1. Set `pipelines_sink` to `Playing`.
2. The signalling server delivers the `producer_id`.
3. Initialize `pipelines_src` to establish a session with `producer_id`.
4. Set `pipelines_src` to `Playing`.
5. Wait for a buffer to be received by the `appsink`.
6. Set `pipelines_src` to `Null`.
7. Set `pipelines_sink` to `Null`.
## First race condition
The first race condition happens in the following sequence:
* `webrtcsink` runs a task to periodically retrieve statistics from `webrtcbin`.
This transitively ends up executing `rtp_session_create_stats`.
* `pipelines_sink` is set to `Null`.
* During the `Paused` to `Ready` transition, `gst_rtp_session_change_state()` calls
  `rtp_session_reset()`.
* The assertion failure occurs when `rtp_session_reset` is called while
`rtp_session_create_stats` is executing.
This is because `rtp_session_create_stats` acquires the lock on `session` prior
to calling `g_hash_table_foreach`, but `rtp_session_reset` doesn't acquire the
lock before calling `g_hash_table_remove_all`.
Acquiring the lock in `rtp_session_reset` fixes the issue and was implemented in
[this branch](https://gitlab.freedesktop.org/fengalin/gstreamer/-/commits/rtpsession-hash_table-criticals). I can open an MR if this seems acceptable.
## Second race condition
The second race condition happens right after the first payload is received:
* `rtp_session_on_timeout` acquires the lock on `session` and proceeds with its
processing.
* `rtp_session_process_rtcp` is called (debug log: "received RTCP packet") and
  attempts to acquire the lock on `session`, which is still held by
  `rtp_session_on_timeout`.
* As part of a hash table iteration, `rtp_session_on_timeout` transitively
  invokes `source_caps`, which releases the lock on `session` in order to call
  `session->callbacks.caps`.
* Since `rtp_session_process_rtcp` was waiting for the lock to be released, it
succeeds in acquiring it and proceeds with `rtp_session_process_rr` which
transitively calls `g_hash_table_insert` via `add_source`.
* After `source_caps` re-acquires the lock and returns control to
  `rtp_session_on_timeout`, the hash table has changed under the iterator,
  resulting in the assertion failure.
I'm not quite sure how to fix this without risking deadlocks.

# iqa compilation fails on dssim
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2535 (2023-05-02, intractabilis)

GStreamer 1.22. If `-Dgpl=enabled` is specified, compilation of GStreamer fails with
```
../gstreamer/subprojects/gst-plugins-bad/ext/iqa/iqa.c: In function ‘do_dssim’:
../gstreamer/subprojects/gst-plugins-bad/ext/iqa/iqa.c:198:3: error: unknown type name ‘dssim_attr’
198 | dssim_attr *attr;
| ^~~~~~~~~~
```
DSSIM is installed following [these](https://github.com/kornelski/dssim#usage-from-c) instructions.

# I want to stream RTSP stream data to AWS Kinesis Video Streams using network camera "XNO-6120R"
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/1013 (2023-05-02, ryupim)

I want to stream RTSP stream data to AWS Kinesis Video Streams using the network camera "XNO-6120R".
A simple configuration is as follows:

Camera (XNO-6120R) --> GStreamer (Ubuntu 22.04 container) --> AWS KVS
After downloading the GStreamer package and building the AWS KVS SDK, I'm trying to transfer RTSP data with the following command, but an error is output.

- Command 1:

  `./kvs_gstreamer_sample $STREAM_NAME $RTSP_URL 2>&1 | tee /log/rtsp2kvs-kvsSample.log`
Output
<details><summary>Click to expand</summary>
[INFO ] [01-05-2023 07:28:37:713.777 GMT] Using region: ap-northeast-1
[INFO ] [01-05-2023 07:28:37:713.799 GMT] Using aws credentials for Kinesis Video Streams
[INFO ] [01-05-2023 07:28:37:713.802 GMT] No session token was detected.
[INFO ] [01-05-2023 07:28:37:714.361 GMT] createKinesisVideoClient(): Creating Kinesis Video Client
[INFO ] [01-05-2023 07:28:37:714.385 GMT] heapInitialize(): Initializing native heap with limit size 134217728, spill ratio 0% and flags 0x00000001
[INFO ] [01-05-2023 07:28:37:714.388 GMT] heapInitialize(): Creating AIV heap.
[INFO ] [01-05-2023 07:28:37:714.397 GMT] heapInitialize(): Heap is initialized OK
[DEBUG] [01-05-2023 07:28:37:714.449 GMT] getSecurityTokenHandler invoked
[DEBUG] [01-05-2023 07:28:37:714.463 GMT] Refreshing credentials. Force refreshing: 0 Now time is: 1682926117714454898 Expiration: 0
[INFO ] [01-05-2023 07:28:37:714.467 GMT] New credentials expiration is 1682929717
[INFO ] [01-05-2023 07:28:37:714.477 GMT] createDeviceResultEvent(): Create device result event.
[DEBUG] [01-05-2023 07:28:37:714.482 GMT] clientReadyHandler invoked
[DEBUG] [01-05-2023 07:28:37:714.497 GMT] Client is ready
[INFO ] [01-05-2023 07:28:37:714.526 GMT] Creating Kinesis Video Stream stream-test-20230424
[INFO ] [01-05-2023 07:28:37:714.538 GMT] createKinesisVideoStream(): Creating Kinesis Video Stream.
[INFO ] [01-05-2023 07:28:37:714.542 GMT] logStreamInfo(): SDK version: e99e7e94d897f309c773cb3bb677cab5c9b5f587
[DEBUG] [01-05-2023 07:28:37:714.545 GMT] logStreamInfo(): Kinesis Video Stream Info
[DEBUG] [01-05-2023 07:28:37:714.547 GMT] logStreamInfo(): Stream name: stream-test-20230424
[DEBUG] [01-05-2023 07:28:37:714.550 GMT] logStreamInfo(): Streaming type: STREAMING_TYPE_REALTIME
[DEBUG] [01-05-2023 07:28:37:714.552 GMT] logStreamInfo(): Content type: video/h264
[DEBUG] [01-05-2023 07:28:37:714.555 GMT] logStreamInfo(): Max latency (100ns): 600000000
[DEBUG] [01-05-2023 07:28:37:714.558 GMT] logStreamInfo(): Fragment duration (100ns): 20000000
[DEBUG] [01-05-2023 07:28:37:714.565 GMT] logStreamInfo(): Key frame fragmentation: Yes
[DEBUG] [01-05-2023 07:28:37:714.567 GMT] logStreamInfo(): Use frame timecode: Yes
[DEBUG] [01-05-2023 07:28:37:714.570 GMT] logStreamInfo(): Absolute frame timecode: Yes
[DEBUG] [01-05-2023 07:28:37:714.572 GMT] logStreamInfo(): Nal adaptation flags: 0
[DEBUG] [01-05-2023 07:28:37:714.575 GMT] logStreamInfo(): Average bandwith (bps): 4194304
[DEBUG] [01-05-2023 07:28:37:714.577 GMT] logStreamInfo(): Framerate: 25
[DEBUG] [01-05-2023 07:28:37:714.580 GMT] logStreamInfo(): Buffer duration (100ns): 1200000000
[DEBUG] [01-05-2023 07:28:37:714.582 GMT] logStreamInfo(): Replay duration (100ns): 400000000
[DEBUG] [01-05-2023 07:28:37:714.585 GMT] logStreamInfo(): Connection Staleness duration (100ns): 600000000
[DEBUG] [01-05-2023 07:28:37:714.587 GMT] logStreamInfo(): Store Pressure Policy: 1
[DEBUG] [01-05-2023 07:28:37:714.590 GMT] logStreamInfo(): View Overflow Policy: 1
[DEBUG] [01-05-2023 07:28:37:714.592 GMT] logStreamInfo(): Segment UUID: NULL
[DEBUG] [01-05-2023 07:28:37:714.595 GMT] logStreamInfo(): Frame ordering mode: 0
[DEBUG] [01-05-2023 07:28:37:714.597 GMT] logStreamInfo(): Track list
[DEBUG] [01-05-2023 07:28:37:714.600 GMT] logStreamInfo(): Track id: 1
[DEBUG] [01-05-2023 07:28:37:714.602 GMT] logStreamInfo(): Track name: kinesis_video
[DEBUG] [01-05-2023 07:28:37:714.605 GMT] logStreamInfo(): Codec id: V_MPEG4/ISO/AVC
[DEBUG] [01-05-2023 07:28:37:714.607 GMT] logStreamInfo(): Track type: TRACK_INFO_TYPE_VIDEO
[DEBUG] [01-05-2023 07:28:37:714.610 GMT] logStreamInfo(): Track cpd: NULL
[INFO ] [01-05-2023 07:28:37:929.061 GMT] writeHeaderCallback(): RequestId: ce6bbea4-ff93-464f-a324-6c04b757cc26
[DEBUG] [01-05-2023 07:28:37:929.103 GMT] describeStreamCurlHandler(): DescribeStream API response: {"StreamInfo":{"CreationTime":1.682322102808E9,"DataRetentionInHours":24,"DeviceName":null,"IngestionConfiguration":null,"KmsKeyId":"xxx","MediaType":null,"Status":"ACTIVE","StreamARN":"xxx","StreamName":"stream-test-20230424","Version":"TZsORArcf5fLv5mCq0IR"}}
(kvs_gstreamer_sample:1114): GStreamer-WARNING **: 07:28:38.165: External plugin loader failed. This most likely means that the plugin loader helper binary was not found or could not be run. You might need to set the GST_PLUGIN_SCANNER environment variable if your setup is unusual. This should normally not be required though.
[INFO ] [01-05-2023 07:28:37:929.770 GMT] describeStreamResultEvent(): Describe stream result event.
[WARN ] [01-05-2023 07:28:37:929.791 GMT] describeStreamResult(): Retention period returned from the DescribeStream call doesn't match the one specified in the StreamInfo
[WARN ] [01-05-2023 07:28:37:929.793 GMT] describeStreamResult(): Content type returned from the DescribeStream call doesn't match the one specified in the StreamInfo
[INFO ] [01-05-2023 07:28:38:157.736 GMT] writeHeaderCallback(): RequestId: 9a4a52f3-f75a-46f1-8795-bb1b5ff51522
[DEBUG] [01-05-2023 07:28:38:157.770 GMT] getStreamingEndpointCurlHandler(): GetStreamingEndpoint API response: {"DataEndpoint":"xxx"}
[INFO ] [01-05-2023 07:28:38:158.180 GMT] getStreamingEndpointResultEvent(): Get streaming endpoint result event.
[DEBUG] [01-05-2023 07:28:38:158.197 GMT] getStreamingTokenHandler invoked
[DEBUG] [01-05-2023 07:28:38:158.205 GMT] Refreshing credentials. Force refreshing: 1 Now time is: 1682926118158203675 Expiration: 1682929717
[INFO ] [01-05-2023 07:28:38:158.209 GMT] New credentials expiration is 1682929718
[INFO ] [01-05-2023 07:28:38:158.217 GMT] getStreamingTokenResultEvent(): Get streaming token result event.
[DEBUG] [01-05-2023 07:28:38:158.225 GMT] streamReadyHandler invoked
[DEBUG] [01-05-2023 07:28:38:158.362 GMT] Stream is ready
[INFO ] [01-05-2023 07:28:38:166.388 GMT] Streaming from rtsp source
New pad found: recv_rtp_src_0_3929480219_26
[INFO ] [01-05-2023 07:28:38:253.416 GMT] Pad link failed
New pad found: recv_rtp_src_1_45886901_107
</details>
GST_DEBUG=6 [rtsp2kvs-kvsSample_level6.log](/uploads/2b550d73fb27a724665c5a85be2d7349/rtsp2kvs-kvsSample_level6.log)
- Command 2:

```sh
gst-launch-1.0 rtspsrc location=$RTSP_URL protocols=udp short-header=TRUE ! \
    rtph264depay ! \
    h264parse ! \
    kvssink stream-name=$STREAM_NAME 2>&1 | tee /log/rtsp2kvs-gstLaunch.log
```
Output
<details><summary>Click to expand </summary>
(gst-launch-1.0:1065): GStreamer-WARNING **: 07:27:45.151: External plugin loader failed. This most likely means that the plugin loader helper binary was not found or could not be run. You might need to set the GST_PLUGIN_SCANNER environment variable if your setup is unusual. This should normally not be required though.
Setting pipeline to PAUSED ...
[INFO ] [01-05-2023 07:27:45:161.642 GMT] createKinesisVideoClient(): Creating Kinesis Video Client
[INFO ] [01-05-2023 07:27:45:161.692 GMT] heapInitialize(): Initializing native heap with limit size 134217728, spill ratio 0% and flags 0x00000001
[INFO ] [01-05-2023 07:27:45:161.697 GMT] heapInitialize(): Creating AIV heap.
[INFO ] [01-05-2023 07:27:45:161.712 GMT] heapInitialize(): Heap is initialized OK
[DEBUG] [01-05-2023 07:27:45:161.777 GMT] getSecurityTokenHandler invoked
[DEBUG] [01-05-2023 07:27:45:161.787 GMT] Refreshing credentials. Force refreshing: 0 Now time is: 1682926065161783781 Expiration: 0
[INFO ] [01-05-2023 07:27:45:161.796 GMT] createDeviceResultEvent(): Create device result event.
[DEBUG] [01-05-2023 07:27:45:161.803 GMT] clientReadyHandler invoked
[INFO ] [01-05-2023 07:27:45:161.826 GMT] try creating stream
[INFO ] [01-05-2023 07:27:45:161.841 GMT] Creating Kinesis Video Stream stream-test-20230424
[INFO ] [01-05-2023 07:27:45:161.847 GMT] createKinesisVideoStream(): Creating Kinesis Video Stream.
[INFO ] [01-05-2023 07:27:45:161.860 GMT] logStreamInfo(): SDK version: e99e7e94d897f309c773cb3bb677cab5c9b5f587
[DEBUG] [01-05-2023 07:27:45:161.864 GMT] logStreamInfo(): Kinesis Video Stream Info
[DEBUG] [01-05-2023 07:27:45:161.868 GMT] logStreamInfo(): Stream name: stream-test-20230424
[DEBUG] [01-05-2023 07:27:45:161.872 GMT] logStreamInfo(): Streaming type: STREAMING_TYPE_REALTIME
[DEBUG] [01-05-2023 07:27:45:161.876 GMT] logStreamInfo(): Content type: video/h264
[DEBUG] [01-05-2023 07:27:45:161.880 GMT] logStreamInfo(): Max latency (100ns): 600000000
[DEBUG] [01-05-2023 07:27:45:161.884 GMT] logStreamInfo(): Fragment duration (100ns): 20000000
[DEBUG] [01-05-2023 07:27:45:161.888 GMT] logStreamInfo(): Key frame fragmentation: Yes
[DEBUG] [01-05-2023 07:27:45:161.893 GMT] logStreamInfo(): Use frame timecode: Yes
[DEBUG] [01-05-2023 07:27:45:161.896 GMT] logStreamInfo(): Absolute frame timecode: Yes
[DEBUG] [01-05-2023 07:27:45:161.900 GMT] logStreamInfo(): Nal adaptation flags: 0
[DEBUG] [01-05-2023 07:27:45:161.904 GMT] logStreamInfo(): Average bandwith (bps): 4194304
[DEBUG] [01-05-2023 07:27:45:161.908 GMT] logStreamInfo(): Framerate: 25
[DEBUG] [01-05-2023 07:27:45:161.912 GMT] logStreamInfo(): Buffer duration (100ns): 1200000000
[DEBUG] [01-05-2023 07:27:45:161.916 GMT] logStreamInfo(): Replay duration (100ns): 400000000
[DEBUG] [01-05-2023 07:27:45:161.919 GMT] logStreamInfo(): Connection Staleness duration (100ns): 600000000
[DEBUG] [01-05-2023 07:27:45:161.924 GMT] logStreamInfo(): Store Pressure Policy: 1
[DEBUG] [01-05-2023 07:27:45:161.928 GMT] logStreamInfo(): View Overflow Policy: 1
[DEBUG] [01-05-2023 07:27:45:161.932 GMT] logStreamInfo(): Segment UUID: NULL
[DEBUG] [01-05-2023 07:27:45:161.936 GMT] logStreamInfo(): Frame ordering mode: 0
[DEBUG] [01-05-2023 07:27:45:161.940 GMT] logStreamInfo(): Track list
[DEBUG] [01-05-2023 07:27:45:161.944 GMT] logStreamInfo(): Track id: 1
[DEBUG] [01-05-2023 07:27:45:161.948 GMT] logStreamInfo(): Track name: kinesis_video
[DEBUG] [01-05-2023 07:27:45:161.952 GMT] logStreamInfo(): Codec id: V_MPEG4/ISO/AVC
[DEBUG] [01-05-2023 07:27:45:161.955 GMT] logStreamInfo(): Track type: TRACK_INFO_TYPE_VIDEO
[DEBUG] [01-05-2023 07:27:45:161.959 GMT] logStreamInfo(): Track cpd: NULL
[INFO ] [01-05-2023 07:27:45:372.623 GMT] writeHeaderCallback(): RequestId: 1a5b505b-9c0a-467b-91b8-bf151fe12615
[DEBUG] [01-05-2023 07:27:45:372.666 GMT] describeStreamCurlHandler(): DescribeStream API response: {"StreamInfo":{"CreationTime":1.682322102808E9,"DataRetentionInHours":24,"DeviceName":null,"IngestionConfiguration":null,"KmsKeyId":"arn:xxx","MediaType":null,"Status":"ACTIVE","StreamARN":"xxx","StreamName":"stream-test-20230424","Version":"TZsORArcf5fLv5mCq0IR"}}
[INFO ] [01-05-2023 07:27:45:373.222 GMT] describeStreamResultEvent(): Describe stream result event.
[WARN ] [01-05-2023 07:27:45:373.234 GMT] describeStreamResult(): Retention period returned from the DescribeStream call doesn't match the one specified in the StreamInfo
[WARN ] [01-05-2023 07:27:45:373.242 GMT] describeStreamResult(): Content type returned from the DescribeStream call doesn't match the one specified in the StreamInfo
[INFO ] [01-05-2023 07:27:45:563.142 GMT] writeHeaderCallback(): RequestId: 8c933c7d-832e-4308-bd7e-a691672d30ba
[DEBUG] [01-05-2023 07:27:45:563.179 GMT] getStreamingEndpointCurlHandler(): GetStreamingEndpoint API response: {"DataEndpoint":"xxx"}
[INFO ] [01-05-2023 07:27:45:563.701 GMT] getStreamingEndpointResultEvent(): Get streaming endpoint result event.
[DEBUG] [01-05-2023 07:27:45:563.719 GMT] getStreamingTokenHandler invoked
[DEBUG] [01-05-2023 07:27:45:563.727 GMT] Refreshing credentials. Force refreshing: 1 Now time is: 1682926065563724704 Expiration: 18446744073709551615
[INFO ] [01-05-2023 07:27:45:563.739 GMT] getStreamingTokenResultEvent(): Get streaming token result event.
[DEBUG] [01-05-2023 07:27:45:563.750 GMT] streamReadyHandler invoked
Stream is ready
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Pipeline is PREROLLED ...
Prerolled, waiting for progress to finish...
Progress: (connect) Connecting to rtsp://192.168.0.100:554/0/onvif/profile1/media.smp
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Redistribute latency...
Redistribute latency...
Progress: (request) Sending PLAY request
Redistribute latency...
Redistribute latency...
Progress: (request) Sent PLAY request
Redistribute latency...
Redistribute latency...
Redistribute latency...
WARNING: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Delayed linking failed.
Additional debug info:
gst/parse/grammar.y(540): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
failed delayed linking some pad of GstRTSPSrc named rtspsrc0 to some pad of GstRtpH264Depay named rtph264depay0
Redistribute latency...
</details>
GST_DEBUG=6 [rtsp2kvs-gstLaunch_level6.log](/uploads/93edab688ae83bd5b3cbddf2b60e6d2c/rtsp2kvs-gstLaunch_level6.log)
By the way, it works correctly with other systems that output RTSP stream data, e.g.:
- rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4
- Live-Reporter ([link](https://apps.apple.com/us/app/live-reporter-live-camera/id996017825))
Please help me.

# whipsink/rtpav1pay: Troubles with AV1 video stream, a backend using 'pion/webrtc' and browser clients
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/348 (2023-06-02, Kristian Ollikainen)

For some reason, AV1 doesn't play well in this setup:
`GStreamer (rtpav1pay + whipsink) -> Go-based backend using 'pion/webrtc' -> Client browsers (such as Chrome)`
The issue is that the resulting stream never plays; packets are not lost, yet the Picture Loss Indication counter keeps rising in `chrome://webrtc-internals`. The only sign of hope I've seen is with the bitrate below the MTU (~1500), which occasionally allows a frame to be decoded and shown, perhaps even some actual video.
I've gone back and forth with a developer from Pion and the issue seems to point here; using a browser to send and receive an AV1 stream works perfectly fine (using this to test, with some minor tweaks to force AV1: https://github.com/Glimesh/broadcast-box).
It's late for me, so I can provide any deeper information later; please just tell me what's needed :pray:

# macOS only: gstvideometa.c "Stride of plane 0 is different... alignment"
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2534 (2023-05-01, F. Duncanh)

In macOS 13.3 (and 10.13.x), GST_DEBUG=2 reveals that EVERY video frame (30 fps H.264, streamed from Apple AirPlay) triggers a warning:
```
0:17:32.989417000 5843 0x1419a1ad0 WARN videometa gstvideometa.c:416:gst_video_meta_validate_alignment: Stride of plane 0 defined in meta (1472) is different from the one computed from the alignment (1440)
0:17:33.024003000 5843 0x1419a1ad0 WARN videometa gstvideometa.c:416:gst_video_meta_validate_alignment: Stride of plane 0 defined in meta (1472) is different from the one computed from the alignment (1440)
0:17:33.053544000 5843 0x1419a1ad0 WARN videometa gstvideometa.c:416:gst_video_meta_validate_alignment: Stride of plane 0 defined in meta (1472) is different from the one computed from the alignment (1440)
0:17:33.087698000 5843 0x1419a1ad0 WARN videometa gstvideometa.c:416:gst_video_meta_validate_alignment: Stride of plane 0 defined in meta (1472) is different from the one computed from the alignment (1440)
```
This error is current and is already present in 1.20.5.
I'll see if it occurs in any older GStreamer I can install.
This is only on macOS, both M2 and Intel.
* EDIT: is this just "spam" that should be suppressed, or is there a bug in computing the alignment (on macOS only?) that should be tracked down? (I have a self-compiled 1.22.2 on macOS 13.3 that I can investigate with.)
The H.264 video plays OK, it seems.

# ndisrc: how to handle an empty stream
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/347 (2023-05-02, Friismestemacher)

Hi there,
I'm trying to set up a GStreamer pipeline to transcode an NDI source.
My NDI stream is created by software that sets up an NDI source but is not always streaming into it.
While something is being streamed my pipeline works fine, but when nothing is sent (though the NDI source still exists) GStreamer exits with: `Got EOS from element "pipeline0"`. Is there a way to keep the pipeline alive through these "black holes"?

# ci: missing dtls elements in Windows images
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/346 (2023-05-21, François Laignel)

dtls elements are not included in Windows CI images, causing a new WebRTC test to fail on this platform.
Isolated job: https://gitlab.freedesktop.org/fengalin/gst-plugins-rs/-/jobs/40919231
Assignee: Jordan Petridіs

# vaapih265enc internal data stream error on Ubuntu 22
https://gitlab.freedesktop.org/gstreamer/gstreamer-vaapi/-/issues/342 (2023-05-11, Xu Hu)

`gst-launch-1.0 videotestsrc ! vaapih265enc ! fakesink -e` gives errors like
```
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'vaapiencodeh265-0': gst.vaapi.Display=context, gst.vaapi.Display=(GstVaapiDisplay)"\(GstVaapiDisplayDRM\)\ vaapidisplaydrm1", gst.vaapi.Display.GObject=(GstObject)"\(GstVaapiDisplayDRM\)\ vaapidisplaydrm1";
ERROR: from element /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0:
streaming stopped, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
```
A suspicious log is `gstvaapiutils.c:131:vaapi_check_status: vaRenderPicture(): invalid VABufferID`.
`gst-launch-1.0 videotestsrc ! vaapih264enc ! fakesink -e` runs normally without any issue.
CPU: 12th Gen Intel(R) Core(TM) i5-12500
OS: Ubuntu 22.04
```
cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.2 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.2 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
```
GStreamer and vaapi are installed from apt.
```
sudo apt list --installed | grep gstreamer
WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
gir1.2-gstreamer-1.0/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed,automatic]
gstreamer1.0-alsa/jammy,now 1.20.1-1 amd64 [installed]
gstreamer1.0-clutter-3.0/jammy-updates,now 3.0.27-2ubuntu1 amd64 [installed,automatic]
gstreamer1.0-gl/jammy,now 1.20.1-1 amd64 [installed]
gstreamer1.0-gtk3/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-libav/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-packagekit/jammy,now 1.2.5-2ubuntu2 amd64 [installed,automatic]
gstreamer1.0-pipewire/jammy-updates,now 0.3.48-1ubuntu3 amd64 [installed,automatic]
gstreamer1.0-plugins-bad/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-plugins-base-apps/jammy,now 1.20.1-1 amd64 [installed,automatic]
gstreamer1.0-plugins-base/jammy,now 1.20.1-1 amd64 [installed]
gstreamer1.0-plugins-good/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-plugins-ugly/jammy,now 1.20.1-1 amd64 [installed]
gstreamer1.0-pulseaudio/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-qt5/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-tools/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
gstreamer1.0-vaapi/jammy-updates,now 1.20.1-1ubuntu1 amd64 [installed]
gstreamer1.0-x/jammy,now 1.20.1-1 amd64 [installed]
libgstreamer-gl1.0-0/jammy,now 1.20.1-1 amd64 [installed,automatic]
libgstreamer-opencv1.0-0/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed,automatic]
libgstreamer-plugins-bad1.0-0/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed,automatic]
libgstreamer-plugins-bad1.0-dev/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
libgstreamer-plugins-base1.0-0/jammy,now 1.20.1-1 amd64 [installed,automatic]
libgstreamer-plugins-base1.0-dev/jammy,now 1.20.1-1 amd64 [installed]
libgstreamer-plugins-good1.0-0/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed,automatic]
libgstreamer-plugins-good1.0-dev/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed,automatic]
libgstreamer1.0-0/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
libgstreamer1.0-dev/jammy-updates,now 1.20.3-0ubuntu1 amd64 [installed]
```
vainfo output:
```
vainfo
error: can't connect to X server!
libva info: VA-API version 1.19.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_14
libva info: va_openDriver() returns 0
vainfo: VA-API version: 1.19 (libva 2.15.0)
vainfo: Driver version: Intel iHD driver for Intel(R) Gen Graphics - 22.3.1 ()
vainfo: Supported profile and entrypoints
VAProfileNone : VAEntrypointVideoProc
VAProfileNone : VAEntrypointStats
VAProfileMPEG2Simple : VAEntrypointVLD
VAProfileMPEG2Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointVLD
VAProfileH264Main : VAEntrypointEncSliceLP
VAProfileH264High : VAEntrypointVLD
VAProfileH264High : VAEntrypointEncSliceLP
VAProfileJPEGBaseline : VAEntrypointVLD
VAProfileJPEGBaseline : VAEntrypointEncPicture
VAProfileH264ConstrainedBaseline: VAEntrypointVLD
VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
VAProfileHEVCMain : VAEntrypointVLD
VAProfileHEVCMain : VAEntrypointEncSliceLP
VAProfileHEVCMain10 : VAEntrypointVLD
VAProfileHEVCMain10 : VAEntrypointEncSliceLP
VAProfileVP9Profile0 : VAEntrypointVLD
VAProfileVP9Profile0 : VAEntrypointEncSliceLP
VAProfileVP9Profile1 : VAEntrypointVLD
VAProfileVP9Profile1 : VAEntrypointEncSliceLP
VAProfileVP9Profile2 : VAEntrypointVLD
VAProfileVP9Profile2 : VAEntrypointEncSliceLP
VAProfileVP9Profile3 : VAEntrypointVLD
VAProfileVP9Profile3 : VAEntrypointEncSliceLP
VAProfileHEVCMain12 : VAEntrypointVLD
VAProfileHEVCMain422_10 : VAEntrypointVLD
VAProfileHEVCMain422_12 : VAEntrypointVLD
VAProfileHEVCMain444 : VAEntrypointVLD
VAProfileHEVCMain444 : VAEntrypointEncSliceLP
VAProfileHEVCMain444_10 : VAEntrypointVLD
VAProfileHEVCMain444_10 : VAEntrypointEncSliceLP
VAProfileHEVCMain444_12 : VAEntrypointVLD
VAProfileHEVCSccMain : VAEntrypointVLD
VAProfileHEVCSccMain : VAEntrypointEncSliceLP
VAProfileHEVCSccMain10 : VAEntrypointVLD
VAProfileHEVCSccMain10 : VAEntrypointEncSliceLP
VAProfileHEVCSccMain444 : VAEntrypointVLD
VAProfileHEVCSccMain444 : VAEntrypointEncSliceLP
VAProfileAV1Profile0 : VAEntrypointVLD
VAProfileHEVCSccMain444_10 : VAEntrypointVLD
VAProfileHEVCSccMain444_10 : VAEntrypointEncSliceLP
```
GStreamer verbose logs are attached as a text file: [gstreamer_vaapi_debug_logs.txt](/uploads/21c5f145e93eeb6e93a93481cddb6124/gstreamer_vaapi_debug_logs.txt)

# v4l2sink: incorrectly reports "bytes_used" in the queued v4l2-buffer
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2532 (2023-05-02, umläute)

There has been a long-standing bug report with the [v4l2loopback](https://github.com/umlaeute/v4l2loopback/issues/190) kernel module, where `ffmpeg` reports errors when reading from `v4l2loopback` with streams provided by GStreamer's `v4l2sink`.
The pipeline is launched as follows (with `/dev/video2` being a loopback device):
```sh
$ gst-launch-1.0 videotestsrc ! video/"x-raw,width=320,height=240" ! videoconvert ! v4l2sink device=/dev/video2
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Redistribute latency...
New clock: GstSystemClock
0:00:01.4 / 99:99:99.
```
Reading the device with `ffmpeg` gives us something like this:
```sh
$ ffplay -hide_banner -i /dev/video2
Input #0, video4linux2,v4l2, from '/dev/video2':
Duration: N/A, start: 433809.417170, bitrate: 36864 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 320x240, 36864 kb/s, 30 fps, 30 tbr, 1000k tbn
[video4linux2,v4l2 @ 0x7fff70000c80] Dequeued v4l2 buffer contains 155648 bytes, but 153600 were expected. Flags: 0x00006001.
[video4linux2,v4l2 @ 0x7fff70000c80] Dequeued v4l2 buffer contains 155648 bytes, but 153600 were expected. Flags: 0x00006001.
[video4linux2,v4l2 @ 0x7fff70000c80] Dequeued v4l2 buffer contains 155648 bytes, but 153600 were expected. Flags: 0x00006001.
```
typically, ffmpeg freezes the image in this case.
afaict, the problem is that the `struct v4l2_buffer` has two members `bytesused` and `length`
```c
struct v4l2_buffer {
__u32 index;
__u32 type;
__u32 bytesused;
// ...
__u32 length;
```
with the [following semantics](https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/buffer.html):
| member | descr |
|--------|-------|
| bytesused | The number of bytes occupied by the data in the buffer. It depends on the negotiated data format and may change with each buffer for compressed variable size data like JPEG images. Drivers must set this field when type refers to a capture stream, applications when it refers to an output stream. [...] |
| length | Size of the buffer (not the payload) in bytes for the single-planar API. [...] |
in the above example we observe that the payload on the video device is a 320x240 YUY2 stream, which has a nominal payload of **320x240x2=153600** bytes.
on a system with a pagesize of 4096 bytes, a buffer will be padded to **155648** bytes.
it seems that the problem really is that `ffmpeg` knows the payload is **153600** bytes, so when it receives buffers claiming `bytesused=155648` it complains that something is wrong.
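the two sizes in the ffmpeg warning can be reproduced with a little arithmetic (a sketch; the 4096-byte pagesize matches the system described above, and the helper names are made up for illustration):

```c
#include <assert.h>
#include <stddef.h>

/* nominal payload of a packed raw video frame; YUY2 (YUYV 4:2:2)
 * packs 2 bytes per pixel */
size_t yuy2_payload(size_t width, size_t height)
{
    return width * height * 2;
}

/* buffer length as a driver typically allocates it: the payload
 * rounded up to a whole number of memory pages */
size_t page_aligned_length(size_t payload, size_t pagesize)
{
    return (payload + pagesize - 1) / pagesize * pagesize;
}
```

for the 320x240 YUY2 stream above this gives a payload of 153600 and a page-aligned length of 155648, exactly the two numbers in the warning; that is what suggests the buffer's `length` is being reported where `bytesused` was expected.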
so who is to blame?
- `ffmpeg` might be a bit more relaxed about overly large buffers; however, it is correct in complaining that the reported payload size does not match the actual one.
- `v4l2loopback` simply copies the buffer metadata from the OUTPUT stream to the CAPTURE stream. i think this is reasonable, as `v4l2loopback` tries to be a very thin connecting layer (but then, i'm the v4l2loopback upstream, so i might be biased)
- GStreamer's `v4l2sink` is the one that sets both the `bytesused` and the `length` fields of the buffers. i argue that it should set the `bytesused` field to the actual payload size.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2531
I get an error in wasapi2src.
2023-04-28T11:42:47Z gtk2k

Windows Server 2022 build: 20348
gst-launch-1.0 version 1.22.2
GStreamer 1.22.2
I got the Windows Server 2022 environment, so I tried the following command.
```
gst-launch-1.0 webrtcbin name=sendrecv bundle-policy=max-bundle stun-server=stun://stun.l.google.com:19302 wasapi2src loopback-target-pid=0x00001D30 ! audioconvert ! opusenc ! rtpopuspay ! application/x-rtp,media=audio,encoding-name=OPUS,payload=96 ! sendrecv.
```
But I get the below error.
```
ERROR: from element /GstPipeline:pipeline0/GstWasapi2Src:wasapi2src0: Could not open resource for reading.
Additional debug info:
../sys/wasapi2/gstwasapi2ringbuffer.cpp(328): gst_wasapi2_ring_buffer_post_open_error (): /GstPipeline:pipeline0/GstWasapi2Src:wasapi2src0:
```

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2530
playsink: the element never transitions into playing state with audio and video
2023-04-28T09:47:41Z Andoni Morales Alastruey

The following pipeline using playsink with audio and video doesn't transition into playing state:
`gst-launch-1.0 playsink name=ps audiotestsrc ! ps.audio_raw_sink videotestsrc ! "video/x-raw" ! ps.video_raw_sink`
```
➜ gstreamer git: ✗ gst-launch-1.0 playsink name=ps audiotestsrc ! ps.audio_raw_sink videotestsrc ! "video/x-raw" ! ps.video_raw_sink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Got context from element 'videosink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayCocoa\)\ gldisplaycocoa0";
0:00:00.074332000 58323 0x148e68c90 ERROR glcaopengllayer gstglcaopengllayer.m:161:-[GstGLCAOpenGLLayer copyCGLContextForPixelFormat:]: failed to retrieve GStreamer GL context in CAOpenGLLayer
Redistribute latency...
```
Live:
```
➜ gstreamer git: ✗ gst-launch-1.0 playsink name=ps audiotestsrc is-live=true ! ps.audio_raw_sink videotestsrc is-live=true ! "video/x-raw" ! ps.video_raw_sink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got context from element 'videosink': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayCocoa\)\ gldisplaycocoa0";
0:00:00.071120000 58384 0x1367c5aa0 ERROR glcaopengllayer gstglcaopengllayer.m:161:-[GstGLCAOpenGLLayer copyCGLContextForPixelFormat:]: failed to retrieve GStreamer GL context in CAOpenGLLayer
Redistribute latency..
```
Audio-only or video-only works as expected.

https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/345
webrtcsink: panic when using `Homegrown` congestion control
2023-06-06T20:35:02Z Victor Alvarez

Hi there,
we are seeing the following crash from time to time (I don't have the full trace right now, but the error message is helpful):
```rust
thread '<unnamed>' panicked at 'failed to upgrade `element` (if you don't want to panic, use @default-return)', net/webrtc/src/webrtcsink/imp.rs:1658:33
```
It is related to line `13` of the piece of code below (current line: https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/blob/main/net/webrtc/src/webrtcsink/imp.rs#L2155), which runs when the homegrown congestion control is enabled. Our line numbers are different because we haven't updated the plugin to its newest version yet; however, the code itself is still the same.
```rust
1 if session.congestion_controller.is_some() {
2 let session_id_str = session_id.to_string();
3 if session.stats_sigid.is_none() {
4 session.stats_sigid = Some(rtpbin.connect_closure("on-new-ssrc", true,
5 glib::closure!(@weak-allow-none element, @weak-allow-none webrtcbin
6 => move |rtpbin: gst::Object, session_id: u32, _src: u32| {
7 let rtp_session = rtpbin.emit_by_name::<gst::Element>("get-session", &[&session_id]);
8 let element = element.expect("on-new-ssrc emited when webrtcsink has been disposed?");
9 let webrtcbin = webrtcbin.unwrap();
10 let mut state = element.imp().state.lock().unwrap();
11 if let Some(mut session) = state.sessions.get_mut(&session_id_str) {
12 session.stats_sigid = Some(rtp_session.connect_notify(Some("twcc-stats"),
13 glib::clone!(@strong session_id_str, @weak webrtcbin, @weak element => @default-panic, move |sess, pspec| {
14 // Run the Loss-based control algorithm on new peer TWCC feedbacks
15 element.imp().process_loss_stats(&element, &session_id_str, &sess.property::<gst::Structure>(pspec.name()));
16 })
17 ));
18 }
19 })
20 ));
21 }
22 }
```
It looks to me like a race condition between the weak references in the closure and the actual elements being destroyed at about the same time (which is why it's a bit difficult to reproduce). It seems to me that a `@default-return` that signals an error, propagated upstream so that the corresponding session does not start, could be the solution. Also, why not pass strong references here?
Also, in line `5`, `None` is allowed, but in the body the values are `expect`'d and `unwrap`'d, which would likewise bring the process down if `None` is ever encountered. Wouldn't these situations be better handled if the corresponding session simply did not start on any such error?
Thanks in advance!

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2528
Problem on executing WebRTC pipeline on Android
2023-04-27T10:12:22Z Reza Alizadeh Majd

We are trying to port an existing WebRTC pipeline used for audio/video calls to Android; the application is based on Qt, and the pipeline is as follows:
1. we prepare the pipeline for audio:
```cpp
GstElement *source = gst_element_factory_make("autoaudiosrc", nullptr);
GstElement *volume = gst_element_factory_make("volume", "srclevel");
GstElement *convert = gst_element_factory_make("audioconvert", nullptr);
GstElement *resample = gst_element_factory_make("audioresample", nullptr);
GstElement *queue1 = gst_element_factory_make("queue", nullptr);
GstElement *opusenc = gst_element_factory_make("opusenc", nullptr);
GstElement *rtp = gst_element_factory_make("rtpopuspay", nullptr);
GstElement *queue2 = gst_element_factory_make("queue", nullptr);
GstElement *capsfilter = gst_element_factory_make("capsfilter", nullptr);
GstCaps *rtpcaps = gst_caps_new_simple("application/x-rtp",
"media", G_TYPE_STRING, "audio",
"encoding-name", G_TYPE_STRING, "OPUS",
"payload", G_TYPE_INT, opusPayloadType,
nullptr);
g_object_set(capsfilter, "caps", rtpcaps, nullptr);
gst_caps_unref(rtpcaps);
GstElement *webrtcbin = gst_element_factory_make("webrtcbin", "webrtcbin");
g_object_set(webrtcbin, "bundle-policy", GST_WEBRTC_BUNDLE_POLICY_MAX_BUNDLE, nullptr);
pipe_ = gst_pipeline_new(nullptr);
gst_bin_add_many(GST_BIN(pipe_), source, volume, convert, resample,
queue1, opusenc, rtp, queue2, capsfilter, webrtcbin, nullptr);
gst_element_link_many(source, volume, convert, resample, queue1,
opusenc, rtp, queue2, capsfilter, webrtcbin, nullptr);
```
2. we also try to attach the local camera stream to pipeline using:
```cpp
GstElement *videoconvert = gst_element_factory_make("videoconvert", nullptr);
GstElement *tee = gst_element_factory_make("tee", "videosrctee");
gst_bin_add_many(GST_BIN(pipe_), videoconvert, tee, nullptr);
std::pair<int, int> resolution ={ 640, 480 };
std::pair<int, int> frameRate ={ 30, 1 };
GstElement *camera = gst_element_factory_make("ahcsrc", nullptr);
GstCaps *caps = gst_caps_new_simple("video/x-raw",
"width", G_TYPE_INT, resolution.first,
"height", G_TYPE_INT, resolution.second,
"framerate", GST_TYPE_FRACTION, frameRate.first, frameRate.second,
nullptr);
camerafilter = gst_element_factory_make("capsfilter", "camerafilter");
g_object_set(camerafilter, "caps", caps, nullptr);
gst_caps_unref(caps);
gst_bin_add_many(GST_BIN(pipe_), camera, camerafilter, nullptr);
if (!gst_element_link_many(camera, videoconvert, camerafilter, nullptr)) {
return false;
}
if (callType_ == CallType::VIDEO && !gst_element_link(camerafilter, tee)) {
return false;
}
```
3. then we attached the video stream pipeline
```cpp
GstElement *queue = gst_element_factory_make("queue", nullptr);
GstElement *vp8enc = gst_element_factory_make("vp8enc", nullptr);
g_object_set(vp8enc, "deadline", 1, nullptr);
g_object_set(vp8enc, "error-resilient", 1, nullptr);
GstElement *rtpvp8pay = gst_element_factory_make("rtpvp8pay", nullptr);
GstElement *rtpqueue = gst_element_factory_make("queue", nullptr);
GstElement *rtpcapsfilter = gst_element_factory_make("capsfilter", nullptr);
GstCaps *rtpcaps = gst_caps_new_simple("application/x-rtp",
"media", G_TYPE_STRING, "video",
"encoding-name", G_TYPE_STRING, "VP8",
"payload", G_TYPE_INT, vp8PayloadType,
nullptr);
g_object_set(rtpcapsfilter, "caps", rtpcaps, nullptr);
gst_caps_unref(rtpcaps);
gst_bin_add_many(GST_BIN(pipe_), queue, vp8enc, rtpvp8pay, rtpqueue, rtpcapsfilter, nullptr);
GstElement *webrtcbin = gst_bin_get_by_name(GST_BIN(pipe_), "webrtcbin");
if (!gst_element_link_many(tee, queue, vp8enc, rtpvp8pay, rtpqueue, rtpcapsfilter, webrtcbin, nullptr)) {
gst_object_unref(webrtcbin);
return false;
}
gst_object_unref(webrtcbin);
```
4. finally we pass the pipeline to a `_videoItem` of `GstGLVideoItem` type:
```cpp
GstElement *queue = gst_element_factory_make("queue", nullptr);
GstElement *compositor = gst_element_factory_make("compositor", "compositor");
GstElement *glupload = gst_element_factory_make("glupload", nullptr);
GstElement *glcolorconvert = gst_element_factory_make("glcolorconvert", nullptr);
GstElement *qmlglsink = gst_element_factory_make("qmlglsink", nullptr);
GstElement *glsinkbin = gst_element_factory_make("glsinkbin", nullptr);
g_object_set(compositor, "background", 1, nullptr);
g_object_set(qmlglsink, "widget", _videoItem, nullptr);
g_object_set(glsinkbin, "sink", qmlglsink, nullptr);
gst_bin_add_many(GST_BIN(pipe), queue, compositor, glupload, glcolorconvert, glsinkbin, nullptr);
gst_element_link_many(queue, compositor, glupload, glcolorconvert, glsinkbin, nullptr);
gst_element_sync_state_with_parent(queue);
gst_element_sync_state_with_parent(compositor);
gst_element_sync_state_with_parent(glupload);
gst_element_sync_state_with_parent(glcolorconvert);
gst_element_sync_state_with_parent(glsinkbin);
```
Following the mentioned steps, the camera-usage indicator turns on on the Android device, but we don't receive any stream on either side; on mobile we just see a black screen.
I also attached the application logs including the GStreamer logs.
[gst-test-4.26.6.log](/uploads/71a0c1b2ca4b6c9c98eee5ae4ad748a3/gst-test-4.26.6.log)

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2527
[2.0] {audio,video}rate: set skip-to-first=true as default
2023-04-26T13:15:11Z Guillaume Desmottes

I've been hit again by `skip-to-first=false` being the default on `videorate`. Opening this ticket so we can change the default for 2.0.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2526
matroskamux: file not seekable maybe due to timecode rounding problem
2023-11-07T11:45:53Z NicoDeLarge

### Describe your issue
I am using an app that uses gstreamer to export application internal video streams to stand alone MKV files. Unfortunately, the files generated can be played by e.g. VLC but are not seekable (i.e. I cannot jump to specific positions in the video).
#### Expected Behavior
The videos should be seekable.
#### Observed Behavior
The video is not seekable.
#### Setup
- **Operating System:** Windows 10
- **Device:** Computer
- **GStreamer Version:** 1.22.2
- **Command line:** "jpegdec ! videoconvert ! nvh264enc ! h264parse" (source and sink part of the pipeline are controlled by the app, only the pipeline between can be adapted by the user)
### Steps to reproduce the bug
### How reproducible is the bug?
Every time exporting/generating a MKV using the app.
### Screenshots if relevant
### Solutions you have tried
- Using different input files
- Using different pipelines
- Adapting matroskamux properties (not possible due to limitations in app)
### Related non-duplicate issues
### Additional Information
mkvalidator-0.6 gives the following results:
```
230331_091006.adtfdat_export_Video.mkv
ERR201: Invalid 'Colour' for profile 'matroska v2' in Video at 371
ERR201: Invalid 'Range' for profile 'matroska v2' in Colour at 391
ERR201: Invalid 'MatrixCoefficients' for profile 'matroska v2' in Colour at 391
ERR201: Invalid 'TransferCharacteristics' for profile 'matroska v2' in Colour at 391
ERR201: Invalid 'Primaries' for profile 'matroska v2' in Colour at 391
ERR312: CueEntry Track #1 and timecode g80249104967 ms not found
ERR312: CueEntry Track #1 and timecode g80249119965 ms not found
ERR312: CueEntry Track #1 and timecode g80249164966 ms not found
ERR312: CueEntry Track #1 and timecode g80249194964 ms not found
ERR312: CueEntry Track #1 and timecode g80249209963 ms not found
file "D:\20230331_091006.adtfdat_export_Video.mkv"
created with GStreamer matroskamux version 1.22.2 / GStreamer Matroska muxer
```
mkvtoolnix shows that the cues and the referenced clusters have the same timecode, but the first SimpleBlock of a Cluster sometimes has a timecode that differs by 1×timecodescale from the timecode of the cluster (see e.g. the timestamps in lines 43 vs. 44 or 195 vs. 197 in the mkvinfo below).
[mkvinfo_full.txt](/uploads/616845ad825159636fd1a5c7b25652bb/mkvinfo_full.txt)
I am totally new to GStreamer and video processing and may be wrong in my conclusion, but I dug into the code a little and I suppose the core issue is that matroskamux does a [rounding of the timecodes for the *Blocks*](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-good/gst/matroska/matroska-mux.c#L4214) that in some cases (maybe related to an inappropriate timecodescale) leads to ±1 values, whereas the timecode in the *Clusters* is [taken (quite) straight from the buffer timestamp](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-plugins-good/gst/matroska/matroska-mux.c#L4158).
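A purely numeric illustration of that suspicion (the function names and the exact rounding scheme are assumptions for the sketch, not the actual matroskamux code): if the Cluster timecode is obtained by truncating the buffer timestamp to timecodescale units while the Block timecode is rounded to the nearest unit, the two disagree by one unit whenever the timestamp falls in the upper half of a tick.

```c
#include <assert.h>
#include <stdint.h>

/* cluster timecode: buffer timestamp (ns) truncated to timecodescale units */
int64_t cluster_timecode(int64_t ts_ns, int64_t timecodescale)
{
    return ts_ns / timecodescale;
}

/* block timecode: buffer timestamp rounded to the nearest timecodescale unit */
int64_t block_timecode(int64_t ts_ns, int64_t timecodescale)
{
    return (ts_ns + timecodescale / 2) / timecodescale;
}
```

With a 1 ms timecodescale (1000000 ns), a buffer stamped 33.5 ms would land in a Cluster stamped 33 but a Block stamped 34, the same ±1×timecodescale offset visible in the mkvinfo dump, while a buffer stamped 33.4 ms yields 33 in both.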
Unfortunately, the app does not allow me to configure the properties of matroskamux and try another timecodescale, so my conclusion so far is mainly based on "theoretical analysis".

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2525
Not getting exact size of buffer from appsink after audioresample
2023-04-26T14:41:26Z Bhoomil Chavda

I am using the below version of pipeline in my C/C++ code:
`appsrc ! audioconvert ! audioresample ! "audio/x-raw, rate=48000, format=F32LE" ! appsink`
Basically there are two threads,
The first thread pushes buffers using `gst_app_src_push_buffer`, capturing them from the blocking ALSA API `pcm_readi` at a 4 ms interval.
The second thread pulls buffers using the blocking `gst_app_sink_pull_sample` call.
When the ALSA capture device input changes from 48 kHz to 44.1 kHz or 96 kHz, I set my pipeline to NULL, update the appsrc caps property, and then set it to PLAYING again.
My code works fine when the input is 48 kHz S16LE (i.e. 768-byte input buffers) and the output 48 kHz F32LE (i.e. 1536-byte output buffers).
But I'm having issues with 44.1 kHz and 96 kHz.
For example, in the case of 96 kHz I get 1280 bytes of buffer, then seven times 1536, at the 8th position 1280 again, and so on...
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
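The sizes in the listing above can be cross-checked with a little arithmetic (a sketch; stereo is an inference from the 768/1536-byte figures quoted in the report, and the helper name is made up):

```c
#include <assert.h>
#include <stddef.h>

/* bytes produced by `ms` milliseconds of interleaved PCM audio */
size_t chunk_bytes(size_t rate_hz, size_t channels,
                   size_t bytes_per_sample, size_t ms)
{
    return rate_hz * ms / 1000 * channels * bytes_per_sample;
}
```

4 ms of 48 kHz S16LE stereo is 768 bytes, 4 ms of 48 kHz F32LE stereo is 1536 bytes, and 4 ms of 96 kHz S16LE stereo is also 1536 bytes, so every pull should indeed yield 1536 bytes. The occasional 1280-byte buffer is 32 frames short (1536 - 1280 = 256 bytes, i.e. 32 F32LE stereo frames), which is consistent with the resampler holding back some input as filter history rather than with a caps problem.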
As per theory, the pushed input buffer is 1536 bytes at a 96 kHz rate, and after resampling the 48 kHz output should also be 1536 bytes, so why is this 1280-byte buffer appearing? I'm using the push model of appsrc without utilising the need-data/enough-data signals. Am I missing a property setting on appsrc?

https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/issues/995
Not getting exact size of buffer from appsink after audioresample
2023-04-26T11:20:05Z Bhoomil Chavda

I am using the below version of pipeline in my C/C++ code:
`appsrc ! audioconvert ! audioresample ! "audio/x-raw, rate=48000, format=F32LE" ! appsink`
Basically there are two threads,
The first thread pushes buffers using `gst_app_src_push_buffer`, capturing them from the blocking ALSA API `pcm_readi` at a 4 ms interval.
The second thread pulls buffers using the blocking `gst_app_sink_pull_sample` call.
When the ALSA capture device input changes from 48 kHz to 44.1 kHz or 96 kHz, I set my pipeline to NULL, update the appsrc caps property, and then set it to PLAYING again.
My code works fine when the input is 48 kHz S16LE (i.e. 768-byte input buffers) and the output 48 kHz F32LE (i.e. 1536-byte output buffers).
But I'm having issues with 44.1 kHz and 96 kHz.
For example, in the case of 96 kHz I get 1280 bytes of buffer, then seven times 1536, at the 8th position 1280 again, and so on...
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1536
Pull Buffer size -> 1280 <--- wrong
As per theory, the pushed input buffer is 1536 bytes at a 96 kHz rate, and after resampling the 48 kHz output should also be 1536 bytes, so why is this 1280-byte buffer appearing? I'm using the push model of appsrc without utilising the need-data/enough-data signals. Am I missing a property setting on appsrc?

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2524
wasapi2src on Windows10 21H2
2023-04-30T15:15:38Z gtk2k

https://gstreamer.freedesktop.org/download/
I downloaded the "1.22.2 runtime installer" from here and installed it on Windows10 21H2.
But wasapi2src doesn't have loopback-target-pid property.
How can I use wasapi2src with loopback-target-pid?
```
> gst-inspect-1.0 wasapi2src
Factory Details:
Rank primary + 1 (257)
Long-name Wasapi2Src
Klass Source/Audio/Hardware
Description Stream audio from an audio capture device through WASAPI
Author Nirbheek Chauhan <nirbheek@centricular.com>, Ole André Vadla Ravnås <ole.andre.ravnas@tandberg.com>, Seungha Yang <seungha@centricular.com>
Documentation https://gstreamer.freedesktop.org/documentation/wasapi2/wasapi2src.html
Plugin Details:
Name wasapi2
Description Windows audio session API plugin
Filename D:\gstreamer\1.0\msvc_x86_64\lib\gstreamer-1.0\gstwasapi2.dll
Version 1.22.2
License LGPL
Source module gst-plugins-bad
Documentation https://gstreamer.freedesktop.org/documentation/wasapi2/
Source release date 2023-04-11
Binary package GStreamer Bad Plug-ins source release
Origin URL Unknown package origin
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstBaseSrc
+----GstPushSrc
+----GstAudioBaseSrc
+----GstWasapi2Src
Implemented Interfaces:
GstStreamVolume
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
audio/x-raw
format: { (string)F64LE, (string)F64BE, (string)F32LE, (string)F32BE, (string)S32LE, (string)S32BE, (string)U32LE, (string)U32BE, (string)S24_32LE, (string)S24_32BE, (string)U24_32LE, (string)U24_32BE, (string)S24LE, (string)S24BE, (string)U24LE, (string)U24BE, (string)S20LE, (string)S20BE, (string)U20LE, (string)U20BE, (string)S18LE, (string)S18BE, (string)U18LE, (string)U18BE, (string)S16LE, (string)S16BE, (string)U16LE, (string)U16BE, (string)S8, (string)U8 }
layout: interleaved
rate: [ 1, 2147483647 ]
channels: [ 1, 2147483647 ]
Clocking Interaction:
element is supposed to provide a clock but returned NULL
Element has no URI handling capabilities.
Pads:
SRC: 'src'
Pad Template: 'src'
Element Properties:
actual-buffer-time : Actual configured size of audio buffer in microseconds
flags: readable
Integer64. Range: -1 - 9223372036854775807 Default: -1
actual-latency-time : Actual configured audio latency in microseconds
flags: readable
Integer64. Range: -1 - 9223372036854775807 Default: -1
blocksize : Size in bytes to read per buffer (-1 = default)
flags: readable, writable
Unsigned Integer. Range: 0 - 4294967295 Default: 0
buffer-time : Size of audio buffer in microseconds. This is the maximum amount of data that is buffered in the device and the maximum latency that the source reports. This value might be ignored by the element if necessary; see "actual-buffer-time"
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 200000
device : WASAPI playback device as a GUID string
flags: readable, writable, changeable only in NULL or READY state
String. Default: null
dispatcher : ICoreDispatcher COM object to use. In order for application to ask permission of audio device, device activation should be running on UI thread via ICoreDispatcher. This element will increase the reference count of given ICoreDispatcher and release it after use. Therefore, caller does not need to consider additional reference count management
flags: writable, changeable only in NULL or READY state
Pointer. Write only
do-timestamp : Apply current stream time to buffers
flags: readable, writable
Boolean. Default: false
latency-time : The minimum amount of data to read in each iteration in microseconds. This is the minimum latency that the source reports. This value might be ignored by the element if necessary; see "actual-latency-time"
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 10000
loopback : Open render device for loopback recording
flags: readable, writable, changeable only in NULL or READY state
Boolean. Default: false
low-latency : Optimize all settings for lowest latency. Always safe to enable.
flags: readable, writable, changeable only in NULL or READY state
Boolean. Default: false
mute : Mute state of this stream
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
Boolean. Default: false
name : The name of the object
flags: readable, writable
String. Default: "wasapi2src0"
num-buffers : Number of buffers to output before sending EOS (-1 = unlimited)
flags: readable, writable
Integer. Range: -1 - 2147483647 Default: -1
parent : The parent of the object
flags: readable, writable
Object of type "GstObject"
provide-clock : Provide a clock to be used as the global pipeline clock
flags: readable, writable
Boolean. Default: true
slave-method : Algorithm used to match the rate of the masterclock
flags: readable, writable
Enum "GstAudioBaseSrcSlaveMethod" Default: 2, "skew"
(0): resample - GST_AUDIO_BASE_SRC_SLAVE_RESAMPLE
(1): re-timestamp - GST_AUDIO_BASE_SRC_SLAVE_RE_TIMESTAMP
(2): skew - GST_AUDIO_BASE_SRC_SLAVE_SKEW
(3): none - GST_AUDIO_BASE_SRC_SLAVE_NONE
typefind : Run typefind before negotiating (deprecated, non-functional)
flags: readable, writable, deprecated
Boolean. Default: false
volume : Volume of this stream
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
Double. Range: 0 - 1 Default: 1
```

https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1773
wasapi2src on Windows10 21H2
2023-04-26T10:44:12Z gtk2k

https://gstreamer.freedesktop.org/download/
I downloaded the "1.22.2 runtime installer" from here and installed it on Windows10 21H2.
But wasapi2src doesn't have loopback-target-pid property.
How can I use wasapi2src with loopback-target-pid?
```
> gst-inspect-1.0 wasapi2src
Factory Details:
Rank primary + 1 (257)
Long-name Wasapi2Src
Klass Source/Audio/Hardware
Description Stream audio from an audio capture device through WASAPI
Author Nirbheek Chauhan <nirbheek@centricular.com>, Ole André Vadla Ravnås <ole.andre.ravnas@tandberg.com>, Seungha Yang <seungha@centricular.com>
Documentation https://gstreamer.freedesktop.org/documentation/wasapi2/wasapi2src.html
Plugin Details:
Name wasapi2
Description Windows audio session API plugin
Filename D:\gstreamer\1.0\msvc_x86_64\lib\gstreamer-1.0\gstwasapi2.dll
Version 1.22.2
License LGPL
Source module gst-plugins-bad
Documentation https://gstreamer.freedesktop.org/documentation/wasapi2/
Source release date 2023-04-11
Binary package GStreamer Bad Plug-ins source release
Origin URL Unknown package origin
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstBaseSrc
+----GstPushSrc
+----GstAudioBaseSrc
+----GstWasapi2Src
Implemented Interfaces:
GstStreamVolume
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
audio/x-raw
format: { (string)F64LE, (string)F64BE, (string)F32LE, (string)F32BE, (string)S32LE, (string)S32BE, (string)U32LE, (string)U32BE, (string)S24_32LE, (string)S24_32BE, (string)U24_32LE, (string)U24_32BE, (string)S24LE, (string)S24BE, (string)U24LE, (string)U24BE, (string)S20LE, (string)S20BE, (string)U20LE, (string)U20BE, (string)S18LE, (string)S18BE, (string)U18LE, (string)U18BE, (string)S16LE, (string)S16BE, (string)U16LE, (string)U16BE, (string)S8, (string)U8 }
layout: interleaved
rate: [ 1, 2147483647 ]
channels: [ 1, 2147483647 ]
Clocking Interaction:
element is supposed to provide a clock but returned NULL
Element has no URI handling capabilities.
Pads:
SRC: 'src'
Pad Template: 'src'
Element Properties:
actual-buffer-time : Actual configured size of audio buffer in microseconds
flags: readable
Integer64. Range: -1 - 9223372036854775807 Default: -1
actual-latency-time : Actual configured audio latency in microseconds
flags: readable
Integer64. Range: -1 - 9223372036854775807 Default: -1
blocksize : Size in bytes to read per buffer (-1 = default)
flags: readable, writable
Unsigned Integer. Range: 0 - 4294967295 Default: 0
buffer-time : Size of audio buffer in microseconds. This is the maximum amount of data that is buffered in the device and the maximum latency that the source reports. This value might be ignored by the element if necessary; see "actual-buffer-time"
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 200000
device : WASAPI playback device as a GUID string
flags: readable, writable, changeable only in NULL or READY state
String. Default: null
dispatcher : ICoreDispatcher COM object to use. In order for application to ask permission of audio device, device activation should be running on UI thread via ICoreDispatcher. This element will increase the reference count of given ICoreDispatcher and release it after use. Therefore, caller does not need to consider additional reference count management
flags: writable, changeable only in NULL or READY state
Pointer. Write only
do-timestamp : Apply current stream time to buffers
flags: readable, writable
Boolean. Default: false
latency-time : The minimum amount of data to read in each iteration in microseconds. This is the minimum latency that the source reports. This value might be ignored by the element if necessary; see "actual-latency-time"
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 10000
loopback : Open render device for loopback recording
flags: readable, writable, changeable only in NULL or READY state
Boolean. Default: false
low-latency : Optimize all settings for lowest latency. Always safe to enable.
flags: readable, writable, changeable only in NULL or READY state
Boolean. Default: false
mute : Mute state of this stream
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
Boolean. Default: false
name : The name of the object
flags: readable, writable
String. Default: "wasapi2src0"
num-buffers : Number of buffers to output before sending EOS (-1 = unlimited)
flags: readable, writable
Integer. Range: -1 - 2147483647 Default: -1
parent : The parent of the object
flags: readable, writable
Object of type "GstObject"
provide-clock : Provide a clock to be used as the global pipeline clock
flags: readable, writable
Boolean. Default: true
slave-method : Algorithm used to match the rate of the masterclock
flags: readable, writable
Enum "GstAudioBaseSrcSlaveMethod" Default: 2, "skew"
(0): resample - GST_AUDIO_BASE_SRC_SLAVE_RESAMPLE
(1): re-timestamp - GST_AUDIO_BASE_SRC_SLAVE_RE_TIMESTAMP
(2): skew - GST_AUDIO_BASE_SRC_SLAVE_SKEW
(3): none - GST_AUDIO_BASE_SRC_SLAVE_NONE
typefind : Run typefind before negotiating (deprecated, non-functional)
flags: readable, writable, deprecated
Boolean. Default: false
volume : Volume of this stream
flags: readable, writable, changeable in NULL, READY, PAUSED or PLAYING state
Double. Range: 0 - 1 Default: 1
```