You need to go back to the roots of open source. Fork it, merge your two changes, remove the 90% of the code you don't need, rename it, and write an article about the speed-up in the new successor versus the old thing.
It is a rite of passage. Meet Jellypin, my fork that only allows watching media with subtitles
Forks don't have to be hostile. A perfectly reasonable way to react to an overwhelmed maintainer is just to do a friendly fork. Keep the original name, attribution, git history, etc., update the README, and start acting as a trustworthy lieutenant. You can review stuck PRs and merge them into your own branch, while also merging in upstream master. After a while, if you seem to be making good calls, the original maintainer can do a bulk merge from your branch to bring in many PRs at once, and maybe add you to the repository.
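To make that concrete, here's a rough sketch of the friendly-fork workflow as git commands, using throwaway local repos to stand in for upstream and the fork (the branch name, repo names, and PR number are all made up for illustration):

```shell
#!/bin/sh
set -e

# Identity config so commits work in a clean environment.
export GIT_AUTHOR_NAME="Friendly Fork" GIT_AUTHOR_EMAIL="fork@example.com"
export GIT_COMMITTER_NAME="Friendly Fork" GIT_COMMITTER_EMAIL="fork@example.com"

tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the original, overwhelmed upstream project.
git init -q -b master upstream
(cd upstream && git commit -q --allow-empty -m "initial commit")

# The friendly fork: same name, history, and attribution, just a new remote.
git clone -q "$tmp/upstream" friendly-fork
cd friendly-fork
git remote rename origin upstream

# A long-lived integration branch where stuck PRs get reviewed and landed.
git checkout -q -b integration
git commit -q --allow-empty -m "Merge PR #123: reviewed and landed"  # hypothetical PR

# Upstream keeps moving in the meantime...
(cd ../upstream && git commit -q --allow-empty -m "more upstream work")

# ...so keep merging upstream master to avoid drifting too far to merge back.
git fetch -q upstream
git merge -q --no-edit upstream/master

# The integration branch now carries both the reviewed PR and upstream's work,
# so the original maintainer can later bulk-merge it in one go.
git log --oneline
```

The point of the long-lived integration branch is that it stays continuously mergeable in both directions: contributors get their PRs landed somewhere, and upstream can absorb the whole batch whenever the maintainer has bandwidth.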
Check out my fork, Jellyden(iro). It’s the best way to watch Heat 2. All the media selection garbage is removed for a streamlined Heat 2 experience, because why would you want to watch anything else when you could be watching Heat 2 instead.
Now all I have to do is pull both your forks and create my own so I can add one more feature. This is the future!
Here I was, naively hoping for a fork that would only allow watching Heat 2 with subtitles... Welp, time to put these tokens to good use
It's worth asking "if AI is so great for software development, won't that make it dramatically easier for people to maintain their own forks of software?"
(I suspect the answer ends up being no, but the reasons could be interesting)
I'm curious why you think the answer would be no. I've had some success with resolving complex merges with GPT 5.4, and it seems obvious enough that AI is a good solution for maintainers who don't have anyone they can trust to take over the project whilst also needing to boost throughput.
You jest, but I think there is a kernel of truth here. I do think people should be doing more (friendly) forks instead of funneling everything through upstream.
Ultimately, if the new contributor brings others into the project to review and push it forward, it will quickly outpace development on Jellyfin and become the successful fork. No maintainer can cope alone with the workload of something like Jellyfin, and if they won't appoint additional maintainers, there isn't much else to be done.
The key to success is dealing with the outstanding merges by bringing onboard the maintainers who are already trying to contribute; build up the team, and the merges will get processed a lot faster.
So this is exactly what's unintuitive about queues; an analogy would be car lanes. Intuition might lead you to conclude that if a 2-lane road is constantly congested, widening it to 4 lanes will solve the traffic. But this is not true. Many people who would normally have taken public transport, or simply decided not to commute, will join the traffic until it once again equilibrates. Adding more maintainers without addressing the core problems of the queue won't lead to success.
If you only focus on "solving the traffic" then you're right, adding more lanes ultimately just leads to more lanes being full. But the overall throughput is much higher! We need more holistic solutions, to be sure, but I hope no one thinks that means I-5 around LA could just be 2 lanes of traffic because they'll be full of traffic either way.
Does induced demand apply to open source maintaining? What would be the mechanism for that?
For traffic, more drivers notice that the highway has become easier to drive on and switch to it. Would people notice development speeding up and start filing more issues?