# GStreamer issues
https://gitlab.freedesktop.org/groups/gstreamer/-/issues

## GstBin `async-handling` mode is quite broken
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3244 · Jan Schmidt · 2024-01-24

The `async-handling` setting on `GstBin` is supposed to make a bin inside the pipeline hide internal asynchronous state changes from the outside world. It does that primarily by absorbing `async-start` and `async-done` messages and not forwarding them to the parent bin, and by attempting to "continue state" itself on `async-done` as if it were a toplevel bin (whereas most elements rely on their parent bin to continue any in-progress state change on `async-done`).
However, there's a big problem: the bin's state change method still returns `GST_STATE_CHANGE_ASYNC` to the parent bin. The effect is that the top-level bin receives `GST_STATE_CHANGE_ASYNC` and goes looking to see whether any child elements are still changing state asynchronously or have already finished. Since it doesn't find any incomplete stored `async-start` messages (the child bin with `async-handling=true` never posted one), it assumes the reported async state change is complete, tries to continue to the next state, receives `GST_STATE_CHANGE_ASYNC` from the child bin again, and iterates, consuming 100% CPU, until the underlying state change finally does complete.
The net effect is that the top-level pipeline still ends up waiting for the underlying async state change, despite `async-handling=true`, but it does so busy-waiting at 100% CPU instead of waiting quietly for an `async-done` message like it normally would.
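To make the busy-wait concrete, here is a toy Python model of the parent's "continue state" logic described above. This is purely illustrative, not GStreamer code; the class and function names are made up. The parent sees an async return, finds no recorded `async-start` from the child, assumes the change already finished, and immediately retries:

```python
# Toy model of the busy-wait described above. "Child" stands in for a bin
# with async-handling=true whose internal state change only completes
# after several polls. Names are illustrative, not GStreamer API.
class Child:
    def __init__(self, polls_until_done):
        self.polls_until_done = polls_until_done

    def change_state(self):
        # Keeps returning ASYNC until the internal change finally completes.
        if self.polls_until_done > 0:
            self.polls_until_done -= 1
            return "ASYNC"
        return "SUCCESS"


def parent_continue_state(child):
    """Parent loop: with no stored async-start message from the child, an
    ASYNC return is treated as 'already finished' and retried at once."""
    iterations = 0
    while child.change_state() == "ASYNC":
        iterations += 1  # spins here, consuming CPU, instead of waiting
    return iterations


# The parent spins once per poll until the underlying change completes.
print(parent_continue_state(Child(polls_until_done=5)))  # -> 5
```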
Now, if we try changing the bin's state-change function to not return `GST_STATE_CHANGE_ASYNC`, what happens is that the bin and the top-level pipeline instead immediately proceed to the final state (PLAYING, for example). Inside the `async-handling` bin, when it does finally receive `async-done`, the logic now looks only at the bin's state, sees that it's already in the final state, and does not proceed to change the state of any async child elements; they get left at the old state that was changing asynchronously (usually PAUSED).
Also, along the way, the bin will continue trying to change the state of each child element with each state change step. Some children might be async and stop at READY->PAUSED, returning `GST_STATE_CHANGE_ASYNC`. Most elements will change state synchronously all the way to the final state, but because any state change might be async, we could also end up in a situation where one element inside the bin is changing state asynchronously from NULL->READY, and another from READY->PAUSED. Really, in that situation the elements inside the bin should all continue to change state step-by-step together, waiting as necessary for async changes. That would require some more separate bookkeeping about what the 'internal state' of the bin is, and completely changing the way `gst_bin_change_state_func` handles things when `async-handling=true`.
Finally though, there's a bigger fundamental problem with having `async-handling=true` hide child state changes from the top-level bin, which is that the top-level pipeline can no longer do step-by-step state changes topologically and guarantee that a downstream element will have reached a state where it is able to receive data before data starts flowing in. Fixing that requires external knowledge from the application, or a fundamental change in how data flow is activated between all elements in a pipeline.

## csharp: support newer 'dotnet format' variant
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3243 · Piotr Brzeziński · 2024-01-23

Current `pre-commit.hook`/`format-csharp` looks for `dotnet-format`, which is outdated according to their [GitHub readme](https://github.com/dotnet/format). The new version is just `dotnet format`, and I can confirm that at least on macOS, with the .NET 8.0 SDK installed, `dotnet-format` doesn't exist anymore and the pre-commit hook fails. It would be good to figure out whether we can just replace the old command, or whether both variants have to be supported.
cc @ylatuya

## gst_bin_recalculate_latency() doesn't work (Latency calculation fails)
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3241 · Israel Robotnick · 2024-02-04

### Describe your issue
Since GStreamer 1.22, I repeatedly get a "Can't determine running time for this packet without knowing configured latency" warning from the rtpsession element.
This shouldn't happen, since I call gst_bin_recalculate_latency() on the pipeline element every time a LATENCY message appears on the bus.
Attached are two files: level-6 logs and the pipeline graph.
#### Expected Behavior
Latency calculation should work, with no "Can't determine running time for this packet without knowing configured latency" message.
#### Observed Behavior
I still get the message.
Moreover, I call gst_bin_recalculate_latency() on the pipeline when elements signal LATENCY on the bus, EXCEPT for sink elements (fakesink, udpsink). For some reason, if I do this for sink elements as well, the pipeline stops. Still, in my pipeline the rtpsession is created after any sink element, so I don't think that's the issue.
#### Setup
- **Operating System:** Ubuntu 22.04
- **Device:** Virtual Machine
- **GStreamer Version:** 1.22
- **Command line:** Dynamic pipeline (Python bindings)
The pipeline is attached as a picture (pipeline_debug.jpg) generated by GStreamer.
### Steps to reproduce the bug
1. Create dynamic pipeline as in picture [pipeline_debug](/uploads/46bd1b112cbffbd6cedcba499b93845b/pipeline_debug.jpg)
udpsrc ! mpegtsparse ! tsdemux ! tee ! parsebin ! tee name=output ! queue2 ! fakesink output. ! h264timestamper ! rtph264pay ! webrtcbin
2. Listen on the GStreamer bus for Gst.MessageType.LATENCY messages.
Whenever one arrives, call pipelineElement.recalculate_latency()
### How reproducible is the bug?
Every time, also if I change the input element to rtspsrc.
### Solutions you have tried
I silenced the log, but it doesn't really fix the issue.
I tried changing the input elements and, as mentioned, I disabled recalculate_latency() when fakesink or udpsink emits the LATENCY message, because it stops the whole pipeline. (I'm watching the PTS of each packet via buffer probes on every pad in my pipeline, to know when something is wrong; calling recalculate_latency() causes the PTS to stop, meaning I stop getting buffers.)
### Related non-duplicate issues
This message ("Can't determine running time for this packet without knowing configured latency") didn't happen on GStreamer 1.20, but from issue 1659 I understand that's just because it was silenced before 1.22:
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1659
### Additional Information
[logswithcalc.txt](/uploads/7bbc34190c1be8f1f7401fe8ce0b24b4/logswithcalc.txt)
[pipeline_debug](/uploads/46bd1b112cbffbd6cedcba499b93845b/pipeline_debug.jpg)

## kmssink: Short green screen when doing modesetting with NV12 format
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3240 · Jeffy Chen <jeffy.chen@rock-chips.com> · 2024-01-22

### Describe your issue
When streaming a YUV video (for example NV12) to kmssink with modesetting enabled, there's a green screen at the beginning.
### Steps to reproduce the bug
1. Add a `udelay(1000000)` at the end of `configure_mode_setting()` in `subprojects/gst-plugins-bad/sys/kms/gstkmssink.c`
2. Run: `gst-launch-1.0 videotestsrc ! 'video/x-raw,format=NV12' ! videoconvert ! kmssink force-modesetting=1`
### Solutions you have tried
Force using BGRx format to allocate the initial empty buffer:
https://gitlab.freedesktop.org/JeffyCN/gstreamer/-/commit/d52d64b9f247553ccbf2883793efc50b96f9e197#7ea97ba6d9ea79409214401fbcce376f90c523f1_870_878
But that depends on some other downstream hacks.

## timeout-inactive-sources causing audio sync issues
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3238 · Guru Govindan · 2024-01-19

I work with a number of RTSP camera sources, and one issue that happens is that some of the GStreamer pipelines receive a lot of EOS and keep restarting. The root cause was that data would stop coming from the source, which resulted in the RTP session closing.
The [`timeout-inactive-rtp-sources`](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/commit/7445b73e766b1188983187600e1276f56f4abd28) fix by @meh has been a huge help in resolving this.
But the issue now is that when the camera also provides an audio stream and I use `timeout-inactive-rtp-sources`, the audio and the video go out of sync.
I feel like we need to synchronize the timestamps between audio and video after an event like this?

## custom gst-python plugin is not working in python app
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3234 · Vishal Kumar · 2024-02-14

I have created a custom gst-python plugin for blurring the incoming frame.
This is the plugin code for my **blur** plugin:
```python
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstBase", "1.0")
from gi.repository import Gst, GObject, GstBase, GLib
import numpy as np
import cv2

Gst.init(None)

DEFAULT_KERNEL_SIZE = 11


class Blur(GstBase.BaseTransform):
    """
    Gstreamer Python Plugin for blurring incoming video buffers.

    Args:
        GstBase.BaseTransform: The base class for the plugin.

    Methods:
        do_get_property: Retrieves the value of a specified property.
        do_set_property: Sets the value of a specified property.
        do_set_caps: Extracts information from the input capabilities.
        do_transform_ip: Implements the functionality for blurring the input GstBuffer.
    """
    GST_PLUGIN_NAME = "blur"

    # Plugin's description is stored in the __gstmetadata__ field as a tuple
    # gst-inspect-1.0 blur
    __gstmetadata__ = (
        "Blur Plugin",                                      # Long-name
        "Blur Plugin",                                      # Klass
        "Plugin for blurring incoming video buffers",       # Description
        "vishal (any queries @ vishalkmr01123@gmail.com)",  # Author
    )

    # Pad Template Definition
    # Used for defining the capabilities of the inputs and outputs.
    # Description of a pad that the element will create and use. It contains:
    # - A short name for the pad.
    # - Pad direction.
    # - Existence property. This indicates whether the pad exists always
    #   (an "always" pad), only in some cases (a "sometimes" pad) or only if
    #   the application requested such a pad (a "request" pad).
    # - Supported types/formats by this element (capabilities).
    FORMATS = "{RGB,BGR}"  # Allow only these formats
    src_format = Gst.Caps.from_string(f"video/x-raw,format={FORMATS}")
    sink_format = Gst.Caps.from_string(f"video/x-raw,format={FORMATS}")
    src_pad_template = Gst.PadTemplate.new(
        "src",
        Gst.PadDirection.SRC,
        Gst.PadPresence.ALWAYS,
        src_format
    )
    sink_pad_template = Gst.PadTemplate.new(
        "sink",
        Gst.PadDirection.SINK,
        Gst.PadPresence.ALWAYS,
        sink_format
    )
    __gsttemplates__ = (src_pad_template, sink_pad_template)

    # Installing various properties for the plugin
    __gproperties__ = {
        "kernel-size": (int,                           # type
                        "Kernel Size",                 # nick
                        "Gaussian Kernel Size",        # blurb
                        1,                             # min
                        GLib.MAXINT,                   # max
                        DEFAULT_KERNEL_SIZE,           # default
                        GObject.ParamFlags.READWRITE   # flags
                        ),
    }

    def __init__(self):
        """
        Initializes the Blur class object.
        This is common to all instances of the Blur class.
        Used to initialise the class only once.
        """
        # Calling the __init__() method of the parent class.
        super().__init__()
        # Set default values for properties
        self.kernel_size = DEFAULT_KERNEL_SIZE

    def do_get_property(self, prop: GObject.ParamSpec) -> any:
        """
        Retrieves the value of the specified property.

        Args:
            prop (GObject.ParamSpec): The property object containing information
                about the property name, type, and flags.

        Returns:
            The value of the requested property.

        Raises:
            AttributeError: If the provided property name does not match any
                known properties.
        """
        if prop.name == 'kernel-size':
            return self.kernel_size
        else:
            raise AttributeError(f"Unknown Property {prop.name}")

    def do_set_property(self, prop: GObject.ParamSpec, value: any) -> None:
        """
        Set the values of the properties in the GstBlink class based on
        the provided property name and value.

        Args:
            prop (GObject.ParamSpec): The property object containing information
                about the property name, type, and flags.
            value (any): The value to be set for the property.

        Raises:
            AttributeError: If the name does not match any known properties.

        Returns:
            None
        """
        if prop.name == 'kernel-size':
            # Ensuring the kernel size is odd
            if value % 2 == 0:
                self.kernel_size = value + 1
            else:
                self.kernel_size = value
        else:
            raise AttributeError(f"Unknown Property {prop.name}")

    def do_set_caps(self, incaps: Gst.Caps, outcaps: Gst.Caps) -> bool:
        """
        Extracts information about the width, height, format, framerate, pixel
        aspect ratio, interlace mode, colorimetry, and chroma site from the
        input caps.

        Args:
            incaps (Gst.Caps): The input capabilities (formats) of the pad.
            outcaps (Gst.Caps): The output capabilities (formats) of the pad.

        Returns:
            bool
        """
        struct = incaps.get_structure(0)
        self.width = struct.get_int("width").value
        self.height = struct.get_int("height").value
        self.format = struct.get_string("format")
        self.framerate = struct.get_fraction("framerate")
        self.pixel_aspect_ratio = struct.get_fraction("pixel-aspect-ratio")
        self.interlace_mode = struct.get_string("interlace-mode")
        self.colorimetry = struct.get_string("colorimetry")
        self.chroma_site = struct.get_string("chroma-site")

        caps_string = "\n"
        caps_string += "#" * 100
        caps_string += f"\nFormat: {self.format}"
        caps_string += f"\nHeight: {self.height}"
        caps_string += f"\nWidth: {self.width}"
        caps_string += f"\nFramerate: {self.framerate}"
        caps_string += f"\nPixel Aspect Ratio: {self.pixel_aspect_ratio}"
        caps_string += f"\nInterlace Mode: {self.interlace_mode}"
        caps_string += f"\nColorimetry: {self.colorimetry}"
        caps_string += f"\nChroma Site: {self.chroma_site}\n"
        caps_string += "#" * 100
        Gst.info(caps_string)
        return True

    def do_transform_ip(self, gst_buffer: Gst.Buffer) -> Gst.FlowReturn:
        """
        This method implements the functionality for blurring the input GstBuffer.
        Changes can be made to the input GstBuffer directly (in place) to
        obtain the output GstBuffer.

        Args:
            gst_buffer (Gst.Buffer): The input buffer to be processed.

        Returns:
            Gst.FlowReturn.OK: If the buffer is successfully processed.
            Gst.FlowReturn.ERROR: If a mapping error is encountered.
        """
        try:
            with gst_buffer.map(Gst.MapFlags.READ | Gst.MapFlags.WRITE) as info:
                # Fetch the RGB/BGR image
                image = np.ndarray((self.height, self.width, 3), buffer=info.data, dtype=np.uint8)
                # Blur RGB/BGR frame
                image[:, :, :] = cv2.GaussianBlur(image, (self.kernel_size, self.kernel_size), 0)
                return Gst.FlowReturn.OK
        except Gst.MapError as e:
            Gst.error("Mapping error: %s" % e)
            return Gst.FlowReturn.ERROR


# Register the element factories and other features
# The value of this attribute should be a tuple consisting of:
# - Factory-name for the plugin
# - Plugin Rank
# - Class that implements the element.
GObject.type_register(Blur)
__gstelementfactory__ = (
    Blur.GST_PLUGIN_NAME,
    Gst.Rank.PRIMARY,
    Blur
)
```
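The in-place transform above relies on wrapping the mapped buffer memory in a NumPy array without copying, so writes land directly in the GstBuffer. A minimal standalone sketch of that pattern (no GStreamer or OpenCV needed; the blur is replaced by a trivial in-place write, and the buffer is a plain bytearray standing in for the mapped memory):

```python
import numpy as np

# Stand-in for the mapped Gst.Buffer memory: a small 2x2 "RGB frame".
height, width = 2, 2
buf = bytearray(height * width * 3)  # all zeros, like map_info.data

# Wrap the buffer without copying, exactly as do_transform_ip does.
image = np.ndarray((height, width, 3), buffer=buf, dtype=np.uint8)

# Any in-place write (here a constant; the plugin uses cv2.GaussianBlur)
# modifies the underlying buffer directly.
image[:, :, :] = 128

# The original bytearray now contains the new pixel values.
print(buf == bytearray([128] * (height * width * 3)))  # -> True
```

This zero-copy view is why the plugin can blur the frame without ever calling an explicit "write back" step.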
The plugin can be used to **blur** the whole incoming buffer if used in the below pipeline:
`gst-launch-1.0 filesrc location=sample.mp4 ! qtdemux ! h264parse ! avdec_h264 ! videoconvert ! blur kernel-size=50 ! videoconvert ! autovideosink`
#### Expected Behavior:
The created **blur** plugin can be used as follows:
* inside a gst-launch pipeline.
* inside a C-based GStreamer app where a plugin element is created using gst_element_factory_make
* inside a Python-based GStreamer app where Gst.parse_launch(pipeline_str) is used to create the pipeline
* inside a Python-based GStreamer app where gst elements are created using Gst.ElementFactory.make. Then we add/link them to the pipeline.
#### Observed Behavior:
My plugin works fine in the following scenarios, where the incoming video buffers are blurred:
* When the plugin is used in the gst-launch pipeline.
* When the plugin element is created using gst_element_factory_make inside a C-based GStreamer app.
* When the plugin is used inside a Python app where Gst.parse_launch(pipeline_str) is used to create the pipeline
However, when I create the **blur** gst-python plugin using Gst.ElementFactory.make and add it to the pipeline, the plugin does not blur the incoming buffer. Although no error is seen on the console, it seems the do_transform_ip function is not getting called.
Here is the code for my Python-based GStreamer app where I create the **blur** element and add it to the pipeline:
```python
import sys
import gi
gi.require_version('Gst', '1.0')
from gi.repository import GLib, Gst

Gst.init(None)


def bus_call(bus, message, loop):
    t = message.type
    if t == Gst.MessageType.EOS:
        sys.stdout.write("End-of-stream\n")
        loop.quit()
    elif t == Gst.MessageType.WARNING:
        err, debug = message.parse_warning()
        sys.stderr.write("Warning: %s: %s\n" % (err, debug))
    elif t == Gst.MessageType.ERROR:
        err, debug = message.parse_error()
        sys.stderr.write("Error: %s: %s\n" % (err, debug))
        loop.quit()
    return True


def demuxer_pad_added(demuxer, pad, element):
    """
    Link element to qtdemux dynamically
    """
    if pad.name == 'video_0':
        demuxer.link(element)


# Create a pipeline
pipeline = Gst.Pipeline()

# Create elements
source = Gst.ElementFactory.make("filesrc", "source")
source.set_property("location", "1.mp4")
qtdemux = Gst.ElementFactory.make("qtdemux", "qtdemux")
h264parse = Gst.ElementFactory.make("h264parse", "h264parse")
avdec_h264 = Gst.ElementFactory.make("avdec_h264", "avdec_h264")
videoconvert = Gst.ElementFactory.make("videoconvert", "videoconvert")
custom_plugin = Gst.ElementFactory.make("blur", "blur")
custom_plugin.set_property('kernel-size', 50)
videoconvert1 = Gst.ElementFactory.make("videoconvert", "videoconvert1")
sink = Gst.ElementFactory.make("autovideosink", "sink")

# Add elements to the pipeline
pipeline.add(source)
pipeline.add(qtdemux)
pipeline.add(h264parse)
pipeline.add(avdec_h264)
pipeline.add(videoconvert)
pipeline.add(custom_plugin)
pipeline.add(videoconvert1)
pipeline.add(sink)

# Link elements in the pipeline
if not source.link(qtdemux):
    print("Failed to link source and qtdemux elements")
    exit(1)
if not h264parse.link(avdec_h264):
    print("Failed to link h264parse and avdec_h264 elements")
    exit(1)
if not avdec_h264.link(videoconvert):
    print("Failed to link avdec_h264 and videoconvert elements")
    exit(1)
if not videoconvert.link(custom_plugin):
    print("Failed to link videoconvert and custom_plugin elements")
    exit(1)
if not custom_plugin.link(videoconvert1):
    print("Failed to link custom_plugin and videoconvert1 elements")
    exit(1)
if not videoconvert1.link(sink):
    print("Failed to link videoconvert1 and sink elements")
    exit(1)

qtdemux.connect("pad-added", demuxer_pad_added, h264parse)

# Start the main loop
loop = GLib.MainLoop()
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", bus_call, loop)
pipeline.set_state(Gst.State.PLAYING)
try:
    loop.run()
except KeyboardInterrupt:
    pass

# Stop the pipeline and cleanup
pipeline.set_state(Gst.State.NULL)
```
The **blur** plugin is recognised by gst-inspect and works fine on gst-launch.
Debugging shows that when I create the element in the Python app with `Gst.ElementFactory.make("blur", "blur")`, only the `__init__` and `do_set_property` methods are called. The `do_set_caps` and `do_transform_ip` methods are not executed.
#### Setup
- **Operating System:** Ubuntu 22.04
- **Device:** Computer
- **GStreamer Version:** 1.20.3
My question is why **blur** (my custom gst-python plugin) is not working in my Python-based GStreamer app, although it works fine with gst-launch and the other scenarios I mentioned above.
## RFC: Refactor how accumulators are handled
https://gitlab.freedesktop.org/gstreamer/orc/-/issues/55 · Jorge Zapata · 2024-01-18

Currently in Orc, accumulators are handled as special opcodes. Such opcodes have the flag `ORC_STATIC_OPCODE_ACCUMULATOR` (https://gitlab.freedesktop.org/gstreamer/orc/-/blob/main/orc/orcopcode.h#L28), and the current logic is:
1. The accumulators always end up adding the current accumulator value to the actual opcode operation
2. In case a different kind of operation to accumulate is needed, a new opcode has to be created for that particular case
3. The initial value of the accumulators is always set to zero (through pxor, for example)
4. Given that the accumulator is added at every loop iteration, and the operation is always an addition, loop_shifts (partial data processing) are not a problem: adding zero does not alter the final value.
The opcodes `accw`, `accl` and `accsadubl` are the current accumulator opcodes.
My actual requirement is to implement a `maxf` accumulator, that is, do a `maxf` for an array of floats, accumulate the maximum value and keep it. The approach would be to do the same as point 2 above. Create a new opcode like `accmaxf` and change the conditions of point 3 and point 4 to handle partial data processing and initialize the data with `MINF` instead of 0.
Now, I think this will pollute Orc with more opcodes where the actual operation of the accumulation (the `maxf`) already exists. My proposal would be to just use regular opcodes as accumulator setters.
1. Add a new flag to mark opcodes that can accumulate, something like `ORC_STATIC_OPCODE_CAN_ACCUMULATE` (actually commutative operations).
2. Add a sanity check so that, when an opcode writes to a destination variable that is an accumulator, src arg1 is forced to be the same variable, i.e. `a1 = a1 + s1`.
3. Pass the initial value of the accumulator as part of the OrcExecutor, similar to the current situation of storing the value, but loading it too.
4. At each loop iteration (depending on the loop shift), correctly blend the initial accumulator value with the actual parameter to use. That means, if the initial value is 1 1 1 1 (double words on SSE, because of a multiplication) and the loop shift only holds 4 bytes of data (one double word), blend the value to be 1 1 1 X.
5. Out of every loop, re-use the same opcode to reduce the vector into a single value.
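The points above can be illustrated with a scalar Python model of vector accumulation. It shows why each accumulating operation needs its own identity value: with `add`, the leftover lanes of a partially filled chunk (the loop-shift case) can safely hold 0, while a `maxf`-style accumulator needs `-inf` there. This is a hedged sketch with made-up names, not Orc code or API:

```python
# Toy model of vector accumulation with a loop shift. Names are
# illustrative only, not Orc API.
LANES = 4  # pretend SIMD width


def accumulate(data, op, identity):
    # Initialise every lane with the identity of the operation
    # (0 for add, -inf for maxf), as in points 3 and 4 above.
    lanes = [identity] * LANES
    for i in range(0, len(data), LANES):
        chunk = list(data[i:i + LANES])
        # Partial chunk (loop shift): pad with the identity so the
        # leftover lanes do not alter the final value.
        chunk += [identity] * (LANES - len(chunk))
        lanes = [op(a, b) for a, b in zip(lanes, chunk)]
    # Final reduction re-uses the same operation across lanes (point 5).
    result = lanes[0]
    for lane in lanes[1:]:
        result = op(result, lane)
    return result


data = [3.0, -1.0, 7.0, 2.5, 4.0]                 # 5 elements -> one partial chunk
print(accumulate(data, lambda a, b: a + b, 0.0))  # sum  -> 15.5
print(accumulate(data, max, float("-inf")))       # maxf -> 7.0
```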
Of course, all of that while maintaining compatibility with the old `acc*`-based opcodes. It is already implemented in https://gitlab.freedesktop.org/turran/orc/-/compare/main...accumulators?from_project_id=1360&straight=false with backwards compatibility and without breaking any test.
Please, let me know your thoughts.
PS: I sincerely think that a refactoring in Orc is needed; adding new features is complicated. I can manage that, but I need the maintainers' opinion on this topic. Maybe I should create another issue to discuss it there?

## rtpvrawdepay: Green lines at top of 1080p video feed, possible issue with implementation of RFC 4175
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3231 · Harry Jones · 2024-01-17

I am trying to view an RTP over UDP stream from a proprietary camera using GStreamer 1.16.3 (standard libs) on an Ubuntu Server 20.04 system. This is the pipeline I am using to do so:
`gst-launch-1.0 udpsrc address=239.192.1.1 port=5004...I am trying to view an RTP over UDP stream from a proprietary camera using Gstreamer 1.16.3 (standard libs) on an Ubuntu Server 20.04 system. This is the pipeline I am using to do so:
`gst-launch-1.0 udpsrc address=239.192.1.1 port=5004 buffer-size=26214400 auto-multicast=true multicast-iface="ens2f0np0" caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=YCbCr-4:2:2, depth=(string)8, width=(string)1920, height=(string)1080, payload=(int)98, a-framerate=25" ! rtpvrawdepay ! queue ! videoconvert ! autovideosink`
The stream from the camera is 1920x1080, YCbCr-4:2:2, uncompressed. Using Wireshark to examine the payload headers of incoming packets, I have confirmed that it adheres to RFC 4175; i.e., as per SMPTE 274M, valid scan lines run from line 42 through 1121 (https://datatracker.ietf.org/doc/html/rfc4175). Below is a snippet of the Wireshark output showing the rollover from the last packet of the old frame to the first packet of the new frame (note the SRD row number):
| No. | Time | Source | Destination | Protocol | Length | Identification | SRD Row Number | Info |
| --- | -------- | -------------- | ----------- | -------- | ------ | -------------- | -------------- | ---- |
| 635868 | 7.75517 | 192.168.204.10 | 239.192.1.1 | RTP | 1342 | 0xdb0f (56079) | 1121 | PT=DynamicRTP-Type-98, SSRC=0x0, Seq=6833, Time=2235524549, Mark |
| 635870 | 7.757292 | 192.168.204.10 | 239.192.1.1 | RTP | 1342 | 0xdb10 (56080) | 42 | PT=DynamicRTP-Type-98, SSRC=0x0, Seq=6834, Time=2235528149 |
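A hypothetical sketch of the row arithmetic implied by the capture above (this is not rtpvrawdepay code, just the mapping RFC 4175 / SMPTE 274M would require for this stream; names and constants are taken from the description above):

```python
# The camera sends SRD row numbers 42..1121 (SMPTE 274M active picture),
# which should map to buffer rows 0..1079. Illustrative sketch only.
FIRST_ACTIVE_LINE = 42
ACTIVE_LINES = 1080


def srd_row_to_buffer_row(srd_row):
    row = srd_row - FIRST_ACTIVE_LINE
    if not 0 <= row < ACTIVE_LINES:
        raise ValueError(f"SRD row {srd_row} outside active picture")
    return row


# Rollover seen in the Wireshark capture: last packet of the old frame
# carries row 1121, first packet of the new frame carries row 42.
print(srd_row_to_buffer_row(1121))  # -> 1079 (last buffer row)
print(srd_row_to_buffer_row(42))    # -> 0 (first buffer row)
```

A depayloader that instead treated SRD rows as starting at 0 would leave the first 42 buffer rows unwritten (green in YCbCr) and discard the last 42, which matches the symptom described below.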
When I view the video feed I'm seeing a block of 42 green lines at the top with 42 lines of the actual stream cut from the bottom, as per the screenshot below:
![greenlines](/uploads/f86037a3cd373f9d779eec3d4333065a/greenlines.png)
I'm uncertain whether this is a GStreamer bug or the pipeline I'm using. However, when viewing the feed through a proprietary Windows application that correctly implements SMPTE 274M, the green bars do not appear. This suggests to me that GStreamer (or, more specifically, the rtpvrawdepay element) isn't handling the stream as per the standard and is displaying invalid scan lines 0 through 41.
Any help or links to similar problems would be greatly appreciated!

## GstQueue ignores min-threshold-* when serialised event is sent downstream
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3229 · Elliot Levin · 2024-01-28

### Describe your issue
When sending custom events down the pipeline, GstQueue will flush all queued buffers, ignoring min-threshold-* properties.
This behaviour seems to be traced to this original bug report: https://bugzilla.gnome.org/show_bug.cgi?id=762875
Which led to this change: https://github.com/GStreamer/gstreamer/commit/23b32d56008d364257d1d186da52650cb4475aa4
#### Expected Behavior
That the queue would respect min-threshold-* for both buffers and serialized downstream events.
#### Observed Behavior
All buffers and events flush immediately.
#### Setup
- **Operating System:** Debian
- **GStreamer Version:** 1.22.0
- **Command line:** N/A
### Steps to reproduce the bug
1. create `queue min-threshold-time=3000000000`
2. enqueue buffers
3. enqueue custom downstream event
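As a rough illustration of the reported behaviour, here is a toy Python model (not GstQueue internals; class and method names are made up). Buffers queue up behind the time threshold as expected, but a serialized event causes everything ahead of it to be pushed out at once:

```python
from collections import deque

# Toy model of the reported GstQueue behaviour, not the real element.
class ToyQueue:
    def __init__(self, min_threshold_time):
        self.min_threshold_time = min_threshold_time
        self.items = deque()  # (kind, timestamp-or-name)
        self.pushed = []      # what went downstream

    def _queued_duration(self):
        bufs = [t for kind, t in self.items if kind == "buffer"]
        return max(bufs) - min(bufs) if len(bufs) > 1 else 0

    def enqueue_buffer(self, ts):
        self.items.append(("buffer", ts))
        # Normal case: hold data until min-threshold-time is exceeded.
        while self._queued_duration() > self.min_threshold_time:
            self.pushed.append(self.items.popleft())

    def enqueue_event(self, name):
        # Reported behaviour: a serialized event drains everything
        # ahead of it immediately, ignoring the threshold.
        while self.items:
            self.pushed.append(self.items.popleft())
        self.pushed.append(("event", name))


q = ToyQueue(min_threshold_time=3_000_000_000)  # 3 s, as in the repro
for ts in (0, 1_000_000_000, 2_000_000_000):    # only 2 s queued
    q.enqueue_buffer(ts)
print(len(q.pushed))  # nothing pushed yet: threshold not reached
q.enqueue_event("custom-downstream")
print(len(q.pushed))  # all 3 buffers plus the event flushed at once
```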
### How reproducible is the bug?
Always
## s3sink: split files like splitmuxsink
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/481 · Inhyeok Kim · 2024-01-29

Hi, I am a newbie on GStreamer.
Does someone have a plan to support file splitting like splitmuxsink in s3sink?

## v4l2h264dec seems too restrictive when it comes to colorimetry caps
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3227 · Martin Dørum · 2024-01-16

I have a pipeline which is more or less `appsrc ! h264parse ! v4l2h264dec ! appsink`. It didn't work with certain h264 streams, and I found out it's due to colorimetry caps.
When I query the v4l2h264dec element's sink pad's caps after setting the pipeline's state to playing, this is what it reports: `video/x-h264, width=(int)[ 64, 1920 ], height=(int)[ 64, 1920 ], framerate=(fraction)[ 0/1, 2147483647/1 ], stream-format=(string)byte-stream, alignment=(string)au, level=(string){ 1, 1b, 1.1, 1.2, 1.3, 2, 2.1, 2.2, 3, 3.1, 3.2, 4, 4.1, 4.2, 5, 5.1 }, profile=(string){ baseline, constrained-baseline, main, high, stereo-high, multiview-high }, colorimetry=(string){ bt601, smpte240m, bt709, 1:3:5:1, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }, parsed=(boolean)true`.
The relevant part here is the colorimetry: `colorimetry=(string){ bt601, smpte240m, bt709, 1:3:5:1, 2:4:5:2, 2:4:5:3, 1:4:7:1, 2:4:7:1, 2:4:12:8, bt2020, 2:0:0:0 }`. When playing the problematic h264 streams, the input caps contains `colorimetry=(string)2:4:16:3`, which isn't in the list of supported colorimetries, which causes the problem.
There are a couple of related issues here:
1. Why does v4l2h264dec constrain the colorimetries at all? Doesn't it just output the color values which were in the h264 stream? Isn't it the responsibility of the consumer of the decoded video frames to handle the colorimetry?
2. The colorimetry tuple `2:4:16:3` means range=[16..235], matrix=BT601, transfer=BT601, primaries=BT470BG. From what I can tell, this is identical to bt601. Shouldn't v4l2h264dec therefore also claim to support `2:4:16:3` when it claims to support bt601? Alternatively, shouldn't gstreamer treat them as equivalent when checking if pads are compatible?
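To see why `2:4:16:3` and `bt601` could describe the same thing, the colorimetry string can be decoded field by field. A hedged sketch follows: the enum values below are my reading of GstVideoColorRange / GstVideoColorMatrix / GstVideoTransferFunction / GstVideoColorPrimaries and should be double-checked against the gst-plugins-base video headers before being relied on:

```python
# Decode a GStreamer colorimetry string "range:matrix:transfer:primaries".
# Only the entries needed for this issue are listed; the numeric values
# are assumptions based on my reading of the GstVideo enums.
RANGE = {0: "UNKNOWN", 1: "0-255 (full)", 2: "16-235 (limited)"}
MATRIX = {0: "UNKNOWN", 1: "RGB", 2: "FCC", 3: "BT709", 4: "BT601",
          5: "SMPTE240M", 6: "BT2020"}
TRANSFER = {0: "UNKNOWN", 5: "BT709", 6: "SMPTE240M", 16: "BT601"}
PRIMARIES = {0: "UNKNOWN", 1: "BT709", 2: "BT470M", 3: "BT470BG",
             4: "SMPTE170M", 5: "SMPTE240M", 7: "BT2020"}


def decode(colorimetry):
    r, m, t, p = (int(x) for x in colorimetry.split(":"))
    return (RANGE.get(r, r), MATRIX.get(m, m),
            TRANSFER.get(t, t), PRIMARIES.get(p, p))


# The problematic caps value from the issue:
print(decode("2:4:16:3"))
# limited range, BT601 matrix, BT601 transfer, BT470BG primaries --
# i.e. what the report argues is identical to the "bt601" shorthand.
```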
And for my own sake: currently, I work around this by hard-coding `caps=video/x-h264,colorimetry=bt601` in the appsrc. This seems to work, but is it an alright workaround? Or does v4l2h264dec actually care about colorimetry in some way which could make this workaround problematic?

## Interaction between rtpbin and rtpst2022-1-fec does not work when using the rtpbin `request-fec-decoder` signal
https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3226 · Morten Olsen Lysgaard · 2024-01-16

CC @meh
GStreamer's `ulpfec` element is designed to be compatible with the "broken" FEC in Google's WebRTC implementation; read here for reference: https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/issues/581
Because of this, it expects to receive a single combined stream of RTP packets containing both FEC and media packets.
The code for the `request-fec-decoder` signal on `rtpbin` was originally designed for `ulpfec`.
Because of that, the FEC element returned by the `request-fec-decoder` signal handler is plugged in at a spot that works for `ulpfec`.
Contrary to `ulpfec`, `rtpst2022-1-fec` is implemented to work in the general case: it expects to receive three independent RTP streams, one media and two FEC.
If you want to use `rtpst2022-1-fec` with `rtpbin`, and use the `request-fec-decoder` signal to instantiate it, it will be plugged into the wrong place inside `rtpbin` and therefore will not work. Instead, using `rtpst2022-1-fec` **requires** the `fec-decoders` property of `rtpbin`: that property is handled by a different code path inside rtpbin, which places the element in the correct spot.
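As a sketch of the working route (hedged: the serialized-structure syntax below is taken from the `rtpbin` property documentation, and the escaping may need adjusting for your shell), the decoder is passed through `fec-decoders` as a `GstStructure` mapping RTP session ids to element descriptions, rather than being created in a signal handler:

```
# Fragment only; see the attached receive.sh for a complete working pipeline.
# Keys are RTP session ids, values are element (bin) descriptions.
rtpbin name=rtp \
    fec-decoders='fec,0="rtpst2022-1-fecdec\ size-time\=1000000000";'
```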
See the attached scripts that demonstrate the problem.
* [send.sh](/uploads/528c237302a01be213dd4c15dc0df4d1/send.sh) sends a test rtp stream with rtpst2022-1 fec encoding and H264
* [receive.sh](/uploads/eed21298c3af5a2a382a4bd93a1d7982/receive.sh) receives the stream and displays it using gst-launch-1.0 syntax. Works well.
* [receive.py](/uploads/40017378f8bc18f1df56a0272cf3c53a/receive.py) receives the stream and displays it. It allows setting a boolean for whether to use the `request-fec-decoder` signal or the `fec-decoders` factory string. When using the signal, the receiver breaks and does not show any video.
Matrix chat for reference of when this issue was discovered: https://matrix.to/#/!rFXJkaEjvUdqUHMPew:gstreamer.org/$f6MnvwIYygfCdCNIge66EZwy5_GkwcfH8NGkDynUlmY?via=gstreamer.org&via=matrix.org&via=gnome.or

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3224 "Buffers Stuck When Pausing Live Source with Custom Sink" (Stefan Kieszkowski, 2024-01-29)

Hello, I'm having trouble pausing and then playing the live source element of a pipeline. The issue is that some frames don't make it to the sink when I pause the live source element until after I play it again. Here is my pipeline, sans caps filters:
`autovideosrc ! x264enc ! customSink`
When I change the `autovideosrc`'s state PLAYING->PAUSED, the pipeline immediately stops, and upon setting state back PAUSED->PLAYING, the first few frames that `customSink` receives are the old ones from just before pausing the element.
I've tried several things, including ASYNC handling and flushing, and also came across resources suggesting it may involve the pull/push configuration of elements or the pipeline clock settings. I've experimented with just about every configuration and order I could come up with, and am still not getting the desired behavior. Any guidance would be greatly appreciated.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3223 "dashsink: Why there is no implementation for segment timeline?" (Jasur Dovurov, 2024-01-15)

There are some implementation files for the Segment Timeline representation, [gstmpdsegmenttimelinenode.h](ext/dash/gstmpdsegmenttimelinenode.h) and [gstmpdsegmenttimelinenode.c](ext/dash/gstmpdsegmenttimelinenode.c), but it has not been wired into the struct [_GstMPDRepresentationNode](https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/blob/discontinued-for-monorepo/ext/dash/gstmpdrepresentationnode.h#L35), nor implemented in [gstdashsink.c](ext/dash/gstdashsink.c); there is only the boolean `use-segment-list` to select SegmentList or SegmentTemplate. Are there any plans to continue integrating this feature into dashsink?

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3222 "Incorrect RTSP SETUP Command Format in GStreamer 1.22.0" (Luc Busquin, 2024-02-01)

**Description:**
With GStreamer version 1.22.0, I am encountering an issue when establishing an RTSP connection to an IP camera. The SETUP command in the RTSP handshake is missing essential credentials, preventing the stream from being established. While other commands, such as OPTIONS and DESCRIBE, are formatted correctly, the SETUP command is incomplete compared to previous versions and other RTSP clients like VLC.
This issue was not present in GStreamer version 1.14.
**Expected SETUP Command (GStreamer 1.14 and VLC):**
`SETUP rtsp://192.168.42.10:554/user=admin&password=&channel=1&stream=0.sdp/trackID=3 RTSP/1.0`
**Actual SETUP Command in GStreamer 1.22.0:**
`SETUP rtsp://192.168.42.10:554/trackID=3 RTSP/1.0`
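The difference between the two SETUP lines can be sketched as a URL-joining problem. This is an illustrative Python sketch, not rtspsrc's actual code, and `setup_url` is a hypothetical helper: per RFC 2326, the per-track control attribute is appended to the full stream URL as an extra path segment, so the camera's `user=...&password=...` portion must be preserved, whereas 1.22.0 appears to drop everything after the host before appending `trackID=3`.

```python
# Hypothetical sketch of the expected URL handling, not the rtspsrc implementation.
location = "rtsp://192.168.42.10:554/user=admin&password=&channel=1&stream=0.sdp"

def setup_url(location, control):
    """Append a track control attribute to the stream URL as a path segment,
    keeping the camera's credential/query-style portion intact."""
    return location.rstrip("/") + "/" + control

print(setup_url(location, "trackID=3"))
# rtsp://192.168.42.10:554/user=admin&password=&channel=1&stream=0.sdp/trackID=3
```

The printed URL matches the SETUP line produced by GStreamer 1.14 and VLC above.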
#### Environment
- OS: Debian 12
- Device: Raspberry Pi 5
- GStreamer Version: 1.22.0
- Pipeline Used: `gst-launch-1.0 rtspsrc location='rtsp://192.168.42.10:554/user=admin&password=&channel=1&stream=0.sdp'`
### Steps to Reproduce
1. Open a terminal.
2. Execute the command: `gst-launch-1.0 rtspsrc location='rtsp://192.168.42.10:554/user=admin&password=&channel=1&stream=0.sdp'`.
3. Use Wireshark to observe the RTSP SETUP command.
### Frequency of Occurrence
The bug occurs consistently and is reproducible every time the above command is executed.
### Impact
The malformed SETUP command results in the failure to establish an RTSP stream from the IP camera, which is a critical functionality for applications relying on real-time video feeds.
### Additional Information
- The RTSP stream from the IP camera works as expected when accessed using VLC, which suggests that the camera's RTSP server is functioning correctly.
- The issue seems to be specific to the GStreamer version 1.22.0, as earlier versions (e.g., 1.14) do not exhibit this behavior.
- Network traffic analysis via Wireshark confirms the discrepancy in the SETUP command between GStreamer versions.

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3220 "webrtcdsp: echo cancellation issue with stereo signal" (Fabien Danieau, 2024-01-27)

### Describe your issue
The echo cancellation feature of webrtcdsp breaks the stereo of an audio signal.
Discussed here: https://discourse.gstreamer.org/t/webrtcdsp-stereo-echo-cancellation/739/2
#### Expected Behavior
Captured stereo should be played back as is by the speakers.
`gst-launch-1.0 alsasrc ! webrtcdsp echo-cancel=true ! webrtcechoprobe ! audioconvert ! alsasink`
#### Observed Behavior
Only one channel is output. It goes back to normal with `echo-cancel=false`.
#### Setup
- **Operating System:** Ubuntu 22.04
- **Device:** Computer
- **GStreamer Version:** 1.23.1 (commit 79ffe4f4)
### Steps to reproduce the bug
This pipeline sends two different signals on the left and right channels. Using echo-cancel=true or false, the stereo is degraded.
```
gst-launch-1.0 \
audiotestsrc volume=0.1 ! capsfilter caps=audio/x-raw,format=S16LE,rate=8000,channels=1,channel-mask=\(bitmask\)0x1 name=left \
audiotestsrc freq=1320 volume=0.1 ! capsfilter caps=audio/x-raw,format=S16LE,rate=8000,channels=1,channel-mask=\(bitmask\)0x2 name=right \
interleave name=i left. ! i. right. ! i. \
i. ! webrtcdsp echo-cancel=true ! webrtcechoprobe \
! audioconvert ! audioresample ! alsasink
```
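For reference, the interleaving step the pipeline above performs can be sketched in plain Python (illustrative only, not GStreamer code): two mono sample streams become one stereo stream with alternating left/right samples, and that layout is what `webrtcdsp` receives and should preserve.

```python
def interleave(left, right):
    """Interleave two mono channels into one stereo sample sequence,
    as the `interleave` element does for the left/right branches above."""
    assert len(left) == len(right), "channels must have the same length"
    out = []
    for l, r in zip(left, right):
        out.extend((l, r))  # one stereo frame = [left sample, right sample]
    return out

print(interleave([1, 2, 3], [10, 20, 30]))  # [1, 10, 2, 20, 3, 30]
```

The reported bug is that after echo cancellation only one of the two interleaved channels survives.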
### How reproducible is the bug?
Systematic
### Additional Information
Discussed with @fengalin. The regression seems to have appeared since https://gitlab.freedesktop.org/gstreamer/gstreamer/-/commit/d5755744c3e2b70e9f04704ae9d18b928d9fa456

https://gitlab.freedesktop.org/gstreamer/cerbero/-/issues/461 "Error Android. Generated package with qt5 support fails to link to qmlglsink" (Jose Diego Avila Ruiz, 2024-01-15)

Hello to everyone.
I've been struggling for many days and cannot find a solution; I hope you can help me. I've been using GStreamer on Android for a long time, and I would like to render video on a QML item, so I found **qmlglsink**.
**Relevant note:** I'm writing a beginner's guide on how to use GStreamer on Android with Qt support for audio/video, using an Android Activity (native window) and qmlglsink (once we find a working solution).
First of all, I build the package with the following command:

**./cerbero-uninstalled -v qt5 -c config/cross-android-universal.cbc package gstreamer-1.0**

(BTW, I set the qmake path to /opt/Qt/5.15.2/android/bin.)

Then I uncompress the generated tar package and link it in my CMake...
At this point I have two big questions:

1 - Should the NDK path point to **/cerbero/build/android-ndk-25** or to my "normal" NDK? (I got the same error both ways.)

2 - The second generated tar package (runtime): what is its purpose? Should I use it?
**CMAKE PARAMS**
```
-GNinja
-DCMAKE_BUILD_TYPE:STRING=Debug
-DCMAKE_PROJECT_INCLUDE_BEFORE:PATH=%{IDE:ResourcePath}/package-manager/auto-setup.cmake
-DQT_QMAKE_EXECUTABLE:STRING=%{Qt:qmakeExecutable}
-DCMAKE_PREFIX_PATH:STRING=%{Qt:QT_INSTALL_PREFIX}
-DCMAKE_C_COMPILER:STRING=%{Compiler:Executable:C}
-DCMAKE_CXX_COMPILER:STRING=%{Compiler:Executable:Cxx}
-DANDROID_NATIVE_API_LEVEL:STRING=21
-DANDROID_NDK:PATH=/home/diego/Android/Sdk/ndk/21.3.6528147
-DCMAKE_TOOLCHAIN_FILE:PATH=/home/diego/Android/Sdk/ndk/21.3.6528147/build/cmake/android.toolchain.cmake
-DANDROID_ABI:STRING=arm64-v8a
-DANDROID_STL:STRING=c++_shared
-DCMAKE_FIND_ROOT_PATH:PATH=%{Qt:QT_INSTALL_PREFIX}
-DANDROID_SDK:PATH=/home/diego/Android/Sdk
```
**CMAKE**
```
cmake_minimum_required(VERSION 3.5)
project(TestAndroidVideo LANGUAGES CXX)

set(CMAKE_INCLUDE_CURRENT_DIR ON)
set(CMAKE_AUTOUIC ON)
set(CMAKE_AUTOMOC ON)
set(CMAKE_AUTORCC ON)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

message(STATUS ${ANDROID_ABI})
message(STATUS ${ANDROID_BUILD_ABI_arm64-v8a})

if(ANDROID_ABI STREQUAL "armeabi-v7a")
    set(GSTREAMER_ARCH_FOLDER armv7)
elseif(ANDROID_ABI STREQUAL "arm64-v8a")
    set(GSTREAMER_ARCH_FOLDER arm64)
elseif(ANDROID_ABI STREQUAL "x86")
    set(GSTREAMER_ARCH_FOLDER x86)
endif()
message(STATUS "GSTREAMER_ARCH_FOLDER: " ${GSTREAMER_ARCH_FOLDER})

set(GST_ROOT_ANDROID "/home/diego/Desktop/cerbero/build/dist/android_universal/${GSTREAMER_ARCH_FOLDER}")
message(STATUS "GST_ROOT_ANDROID: " ${GST_ROOT_ANDROID})

set(ANDROID_PACKAGE_SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/android")
set(LIB_GSTREAMER_ANDROID "${ANDROID_PACKAGE_SOURCE_DIR}/libs/${ANDROID_ABI}/libgstreamer_android.so")
message(STATUS "LIB_GSTREAMER_ANDROID: " ${LIB_GSTREAMER_ANDROID})

add_custom_command(OUTPUT ${LIB_GSTREAMER_ANDROID}
    COMMAND "${ANDROID_PACKAGE_SOURCE_DIR}/build_gstreamer.sh" "${ANDROID_PACKAGE_SOURCE_DIR}" --argument)
add_custom_target(gstreamer_lib ALL DEPENDS ${LIB_GSTREAMER_ANDROID})

set(ANDROID_EXTRA_LIBS ${LIB_GSTREAMER_ANDROID})
set(DISTFILES "${ANDROID_PACKAGE_SOURCE_DIR}/AndroidManifest.xml"
    "${ANDROID_PACKAGE_SOURCE_DIR}/build.gradle"
    "${ANDROID_PACKAGE_SOURCE_DIR}/res/values/libs.xml"
    "${ANDROID_PACKAGE_SOURCE_DIR}/src/com/atr/amc/SurfaceViewManager.java")

find_package(Qt5 COMPONENTS Core Quick AndroidExtras REQUIRED)

file(GLOB_RECURSE HEADERS "*.h")
file(GLOB_RECURSE SOURCES "*.cpp")
set(RESOURCES "qml.qrc")

# On Android the app is built as a shared library loaded by the Java activity,
# so there is a single TestAndroidVideo target (an add_executable with the same
# name would clash with it and fail to configure).
add_library(TestAndroidVideo SHARED ${SOURCES} ${HEADERS} ${RESOURCES})

target_link_libraries(TestAndroidVideo PRIVATE ${LIB_GSTREAMER_ANDROID} android)
target_include_directories(TestAndroidVideo PRIVATE
    ${GST_ROOT_ANDROID}/include/gstreamer-1.0
    ${GST_ROOT_ANDROID}/include/glib-2.0
    ${GST_ROOT_ANDROID}/lib/glib-2.0/include)

target_compile_definitions(TestAndroidVideo PRIVATE $<$<OR:$<CONFIG:Debug>,$<CONFIG:RelWithDebInfo>>:QT_QML_DEBUG>)
target_link_libraries(TestAndroidVideo PRIVATE Qt5::Core Qt5::Quick Qt5::AndroidExtras)
```
**BUILD GSTREAMER SH**
```
echo $@;
. ~/.bashrc
cd $1
ndk-build
```
**Android.MK**
```
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)
LOCAL_MODULE := TestAndroidVideo
LOCAL_SHARED_LIBRARIES := gstreamer_android
LOCAL_LDLIBS := -llog
include $(BUILD_SHARED_LIBRARY)

ifndef GSTREAMER_ROOT_ANDROID
$(error GSTREAMER_ROOT_ANDROID is not defined!)
endif

ifeq ($(TARGET_ARCH_ABI),armeabi)
GSTREAMER_ROOT := $(GSTREAMER_ROOT_ANDROID)/arm
else ifeq ($(TARGET_ARCH_ABI),armeabi-v7a)
GSTREAMER_ROOT := $(GSTREAMER_ROOT_ANDROID)/armv7
else ifeq ($(TARGET_ARCH_ABI),arm64-v8a)
GSTREAMER_ROOT := $(GSTREAMER_ROOT_ANDROID)/arm64
else ifeq ($(TARGET_ARCH_ABI),x86)
GSTREAMER_ROOT := $(GSTREAMER_ROOT_ANDROID)/x86
else ifeq ($(TARGET_ARCH_ABI),x86_64)
GSTREAMER_ROOT := $(GSTREAMER_ROOT_ANDROID)/x86_64
else
$(error Target arch ABI not supported: $(TARGET_ARCH_ABI))
endif

GSTREAMER_NDK_BUILD_PATH := $(GSTREAMER_ROOT)/share/gst-android/ndk-build/
include $(GSTREAMER_NDK_BUILD_PATH)/plugins.mk
GSTREAMER_PLUGINS := $(GSTREAMER_PLUGINS_CORE) $(GSTREAMER_PLUGINS_QT5)
GSTREAMER_EXTRA_LIBS := -liconv
GSTREAMER_EXTRA_DEPS := gstreamer-controller-1.0 gstreamer-video-1.0 gstreamer-base-1.0 gstreamer-pbutils-1.0 openssl
include $(GSTREAMER_NDK_BUILD_PATH)/gstreamer-1.0.mk
```

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3218 "webrtcbin: set-description implementation is not spec-compliant" (Philippe Normand, 2024-01-23)

Our implementation of https://w3c.github.io/webrtc-pc/#set-the-session-description doesn't look correct. For instance, we apply step 6.5 (https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/0d04660c5dca53096db1711d7925e1920685b137/subprojects/gst-plugins-bad/ext/webrtc/gstwebrtcbin.c#L6454) before step 6.4 (https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/0d04660c5dca53096db1711d7925e1920685b137/subprojects/gst-plugins-bad/ext/webrtc/gstwebrtcbin.c#L6595)
Also, per an addition to the spec from a couple of years ago, calling `set-local-description` without a description should use either an internally generated offer or an answer, depending on the signaling state. Currently we reject this case (bad input).

https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3216 "gl/glx: gst_gl_context_glx_activate causes leaks" (Robert Mader, 2024-02-06)

From https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5727:
Adding passthrough support to `glcolorconvert` causes leaks on GLX. There are [various tests blocklisted for valgrind](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/blob/main/subprojects/gst-devtools/validate/launcher/testsuites/check.py?ref_type=heads#L92-95) already mentioning reasons like "driver leaks" but not linking an issue, so I assume the MR is hitting the same leak; I figured it would make sense to create this issue, possibly allowing a bunch of valgrind blocklist entries to be removed once it is fixed.
<details><summary>valgrind summary</summary>
Running suite(s): autovideoconvert
0%: Checks: 1, Failures: 0, Errors: 1
../subprojects/gst-plugins-bad/tests/check/elements/autovideoconvert.c:89:E:general:test_autovideoconvert_videoconvert:0: (after this point) Early exit with return value 20
Check suite autovideoconvert ran in 13.314s (tests failed: 1)
**Duration**: 17.789213180541992
## test_autovideoconvert_videoconvert.valgrind:
```
==17399== Memcheck, a memory error detector
==17399== Copyright (C) 2002-2017, and GNU GPL'd, by Julian Seward et al.
==17399== Using Valgrind-3.18.1 and LibVEX; rerun with -h for copyright info
==17399== Command: /builds/rmader/gstreamer/build/subprojects/gst-plugins-bad/tests/check/elements_autovideoconvert
==17399== Parent PID: 14558
==17399==
==17501==
==17501== HEAP SUMMARY:
==17501== in use at exit: 1,279,683 bytes in 8,547 blocks
==17501== total heap usage: 93,048 allocs, 84,501 frees, 73,210,986 bytes allocated
==17501==
==17501== 5,355 bytes in 255 blocks are definitely lost in loss record 4,227 of 4,250
==17501== at 0x484286F: malloc (vg_replace_malloc.c:381)
==17501== by 0x4C0114E: strdup (strdup.c:42)
==17501== by 0x9B73781: ???
==17501== by 0x9B73524: ???
==17501== by 0x89DFE91: FixupDispatchTable (GLdispatch.c:257)
==17501== by 0x89E0C1F: UnknownInlinedFun (GLdispatch.c:587)
==17501== by 0x89E0C1F: __glDispatchMakeCurrent (GLdispatch.c:555)
==17501== by 0x8902CD1: InternalMakeCurrentDispatch (libglx.c:921)
==17501== by 0x890711A: CommonMakeCurrent (libglx.c:1074)
==17501== by 0x865883A: gst_gl_context_glx_activate (gstglcontext_glx.c:878)
==17501== by 0x86230CC: gst_gl_context_activate (gstglcontext.c:777)
==17501== by 0x862513C: gst_gl_context_create_thread (gstglcontext.c:1335)
==17501== by 0x4A5DC41: g_thread_proxy (gthread.c:826)
==17501== by 0x4F532A4: start_thread (pthread_create.c:481)
==17501== by 0x4C71322: clone (clone.S:95)
==17501==
{
<insert_a_suppression_name_here>
Memcheck:Leak
match-leak-kinds: definite
fun:malloc
fun:strdup
obj:*
obj:*
fun:FixupDispatchTable
fun:UnknownInlinedFun
fun:__glDispatchMakeCurrent
fun:InternalMakeCurrentDispatch
fun:CommonMakeCurrent
fun:gst_gl_context_glx_activate
fun:gst_gl_context_activate
fun:gst_gl_context_create_thread
fun:g_thread_proxy
fun:start_thread
fun:clone
}
==17501== LEAK SUMMARY:
==17501== definitely lost: 5,355 bytes in 255 blocks
==17501== indirectly lost: 0 bytes in 0 blocks
==17501== possibly lost: 0 bytes in 0 blocks
==17501== still reachable: 264,722 bytes in 2,714 blocks
==17501== suppressed: 989,782 bytes in 5,403 blocks
==17501== Reachable blocks (those to which a pointer was found) are not shown.
==17501== To see them, rerun with: --leak-check=full --show-leak-kinds=all
==17501==
==17501== For lists of detected and suppressed errors, rerun with: -s
==17501== ERROR SUMMARY: 1 errors from 1 contexts (suppressed: 2 from 2)
==17399==
==17399== HEAP SUMMARY:
==17399== in use at exit: 655,371 bytes in 1,595 blocks
==17399== total heap usage: 59,723 allocs, 58,128 frees, 35,552,145 bytes allocated
==17399==
==17399== LEAK SUMMARY:
==17399== definitely lost: 0 bytes in 0 blocks
==17399== indirectly lost: 0 bytes in 0 blocks
==17399== possibly lost: 0 bytes in 0 blocks
==17399== still reachable: 96 bytes in 2 blocks
==17399== suppressed: 648,459 bytes in 1,526 blocks
==17399== Reachable blocks (those to which a pointer was found) are not shown.
==17399== To see them, rerun with: --leak-check=full --show-leak-kinds=all
==17399==
==17399== For lists of detected and suppressed errors, rerun with: -s
==17399== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 1 from 1)
</details>https://gitlab.freedesktop.org/gstreamer/orc/-/issues/54Document on how to contribute a new target2024-01-22T10:12:03ZJorge ZapataDocument on how to contribute a new targetNow that ORC has a CONTRIBUTING.md file, and that a lot of excellent work has been done to add AVX support, it will be good to enhance the document with a brief explanation of what is required to add a new target, including the modificat...Now that ORC has a CONTRIBUTING.md file, and that a lot of excellent work has been done to add AVX support, it will be good to enhance the document with a brief explanation of what is required to add a new target, including the modification of generate_xml_table.c. This will help us update the target table at https://gstreamer.pages.freedesktop.org/orc/docs/latest/orc-opcodes.html. If a target has poor coverage, it should be include too to know where to improve ORC.