Discussion about the new max quantum
I just updated to PipeWire 0.3.44 and found that I get xruns in my music projects that weren't there before. I thought this was a bug, but the changelog explains it:
The default maximum latency was reduced from 170ms to 42ms.
So the default max quantum was reduced from 8192 to 2048. Maybe this is fine for light desktop usage and normal content consumption. But when there is a lot going on in the graph, with many plugins loaded, or when it comes to mixing music, at some point you want higher values. For example, I have a modern system, and despite that, working on a project in the mixing stage with anything below a quantum of 4096 just leads to xruns and makes it totally unusable. (What do people on older systems even do?) For that it is really useful to be able to use higher buffer sizes. There is a reason 4096 and 8192 exist in JACK, and apps like Ardour should be allowed to make use of them. Forbidding this seems like a step back.
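For anyone hitting this right now: as far as I know, the cap can be raised locally through the context properties in a PipeWire config drop-in. A sketch (the file name is arbitrary; `default.clock.max-quantum` is the relevant property):

```
# ~/.config/pipewire/pipewire.conf.d/99-max-quantum.conf
# Restore the pre-0.3.44 upper bound on the quantum (buffer size in frames).
context.properties = {
    default.clock.max-quantum = 8192
}
```

After restarting the PipeWire services, DAWs should again be able to request buffer sizes up to 8192. This is a per-machine workaround, though, not an argument against fixing the default.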
But surely there is a reason behind this decision, which I will try to understand. The second sentence from the changelog seems really interesting:
This should improve overall latency for application that ask for a large latency, such as notifications.
Ehm. Let me make another version of this sentence:
"This should improve overall latency and increase xruns for application that justifiably ask for a large latency, such as DAWs."
Anyway, looking at the last part:
for application that ask for a large latency, such as notifications
If an application asks for a higher latency, this could be a perfectly reasonable request, so why should PipeWire not allow it? It shouldn't be PipeWire's job to parent applications and force them to act in some way. It should be each application's job to request its appropriate latency. Some need lower latency, so they should request and get that. Others need higher latency, and they should get that too. We need to adapt in both directions. That seems like perfectly fine logic to me.
Intervening there feels like a bad workaround. If applications or notification daemons set their latency wrong, that is their bug. Sure, we can partly work around it, but only at the cost of breaking other applications. Is that really worth it? I think such a workaround should not exist in this form. It isn't really an improvement; now it is only differently broken. Great sadness.
Also, even if apps/notifications don't set their latency: is it really that big of a problem? A max quantum of 8192, just like before, leads to a max latency (at 48 kHz) of about 170 ms. This is still fast and should be enough for most notifications (and other things you don't have to react to immediately). Do you think users really notice that delay, and that hearing the sound for a new chat message only a little later is a dealbreaker?
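To make the numbers in the changelog concrete, the worst-case one-period latency is just quantum divided by sample rate. A quick sketch (the function name is mine):

```python
def quantum_latency_ms(quantum: int, rate_hz: int = 48000) -> float:
    """Worst-case one-period latency in milliseconds for a given
    quantum (buffer size in frames) at a given sample rate."""
    return quantum / rate_hz * 1000.0

print(round(quantum_latency_ms(8192)))  # 171 -> the "170ms" old cap
print(round(quantum_latency_ms(2048)))  # 43  -> the "42ms" new cap
```

So the whole change buys roughly 128 ms in the worst case for misbehaving clients, at the cost of DAWs losing the 4096 and 8192 buffer sizes.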
So my suggestion: revert that change, allow buffer sizes up to 8192 again, and if something requests an inappropriately high latency, report it to the developer of that app.
Just another thought: in the case of old, unmaintained PulseAudio applications, this lowered max quantum could be applied only under certain circumstances. Maybe limit only pulse apps, or add affected apps to the quirks database.
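If I understand the pipewire-pulse config correctly, something like this could already be expressed per-application with `pulse.rules` instead of a global cap. A sketch only; `some-legacy-app` is a placeholder, and the exact property names would need checking against the docs:

```
# ~/.config/pipewire/pipewire-pulse.conf.d/99-legacy-quirks.conf
# Sketch: clamp the quantum only for a specific legacy PulseAudio
# client instead of lowering the global max quantum for everyone.
pulse.rules = [
    {
        matches = [ { application.process.binary = "some-legacy-app" } ]
        actions = {
            update-props = {
                pulse.max.quantum = "2048/48000"
            }
        }
    }
]
```

That way, well-behaved apps like Ardour keep the full range, and only known-broken clients get restricted.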
Of course, this can be discussed :)