Hello, I would like to comment and bring in another email I sent recently. From https://lists.archlinux.org/hyperkitty/list/aur-general@lists.archlinux.org/...:
I think something which does need to be brought up is a feature to mass delete packages and their subpackages, in a dependency-style way. If some software has hundreds of plugins packaged for it, and the main package is deleted, then (as far as I am aware) all the plugins need to be flagged for deletion, which means the request queue gets flooded. Maybe such a feature exists already and I am simply not aware of it (maybe a package maintainer can comment on this?).
I think this point I made here is very relevant: there is a legitimate use for bulk deletion requests when you are blanket-deleting a large number of similar packages. However, I feel this would also promote the deletion of useful packages simply for being orphaned, which I also discussed in the thread I linked above. That said, the feature would greatly reduce the mailing list spam, and no offense to zoorat, marsseed and anyone else who is attempting to clean up the AUR, but it is painful to look through requests as a normal user when the list is filled with hundreds of requests every day.
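To make the idea a little more concrete, here is a rough sketch of how the dependency-aware part could work, assuming one used the public AUR RPC interface (v5) and its search by the 'depends' field; the script, its name and the workflow around it are purely my own illustration, not an existing AUR feature, and it only produces a candidate list for a human to review:

#!/usr/bin/env python3
# Rough sketch only (hypothetical helper, not an existing AUR feature):
# list AUR packages whose depends field references a given "main" package,
# e.g. the plugin packages that would be left dangling if the main package
# were deleted. Uses the public AUR RPC (v5). Nothing here files any
# requests; the output is just a candidate list for a human to review.
import json
import sys
import urllib.parse
import urllib.request

AUR_RPC = "https://aur.archlinux.org/rpc/"

def dependants(pkgname, by="depends"):
    # 'by' can also be makedepends/optdepends/checkdepends per the RPC docs.
    query = urllib.parse.urlencode({"v": 5, "type": "search", "by": by, "arg": pkgname})
    with urllib.request.urlopen(AUR_RPC + "?" + query) as resp:
        data = json.load(resp)
    return sorted(pkg["Name"] for pkg in data.get("results", []))

if __name__ == "__main__":
    if len(sys.argv) != 2:
        sys.exit("usage: %s <main-package>" % sys.argv[0])
    for name in dependants(sys.argv[1]):
        print(name)

The point of something like this on the server side would be to let the main package and everything such a lookup returns be bundled into one deletion request, instead of hundreds of individual ones hitting the mailing list.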
I think we could say these requests are clearly filed by some automated process that is triggered on any package with arm or aarch64 keywords. But the process catches the wrong targets, and these requests would result in either wrongly deleted packages or increased labor for PMs who have to read the PKGBUILDs themselves before rejecting them.
Well, I think this is up to the PMs themselves: if they say there is nothing wrong with botting the AUR to attempt to detect bad PKGBUILDs, then it is they who deal with the spam. However, there is a lot to be said about the response time of legitimate requests. It does seem like MarsSeed has priority; I have seen their requests be handled very quickly, while others wait for months and get caught in the backlog. I am aware that PMs pick what is easy to do and leave the more complex, time-consuming requests for when they have more time, but it does seem like there is some bias here... possibly because the sheer quantity of requests overshadows individual users.
This is not right. An automated request filing process should not be the way a user files their package requests. If an automated process that simply catches all packages with certain traits and files requests is acceptable, then why is a user needed in the process at all? The user account is just an API key for a bot program in that case. And if doing so is acceptable, why not just give the script, or whatever tech is behind the automated process, to the PMs so they can even save the time needed to check the request list?
Again, it is up to the staff how they administer their platform, and if they deem that this is fine, then there is nothing you can do about it.
Still, I appreciate the amount of time MarsSeed has previously put into clearing the AUR. I think there might be ~10k requests filed by him since he joined the AUR (based on a simple search for 'MarsSeed' in the aur-requests mailing list), and I would like to see him continue his work, but not in such a robotic way.
In general, the people with insane package counts on the AUR use massive amounts of automation to try to cut down on labour time. However, in my opinion, this yields lower-quality packages and, worse, pollutes the AUR with packages that are automatically bumped yet don't work, because someone left their GitHub CI running for the last five years and hasn't checked it since. These are also hard to get orphaned, because the package looks active when in fact it is just a bot keeping its pkgver up to date. Anyways, TL;DR: the suggestion I made above might be a good way to help deal with this problem.

Take care,
--
Polarian

GPG signature: 0770E5312238C760
Website: https://polarian.dev
JID/XMPP: polarian@icebound.dev