Besides this issue, I already mentioned another drawback of using HTTPS: untrusted certificates (expired, self-signed, or simply signed by an untrusted CA) will cause the build to fail. This was a real problem for OpenWRT, which is why they switched to using --no-check-certificate in 2010 [1] to avoid build failures. The sources are already validated with a checksum anyway.
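To make that concrete, here is a minimal sketch of that workflow (the URL and file names are hypothetical): the TLS check is skipped, and it is the recorded checksum that actually guards the download:

    wget --no-check-certificate https://example.org/foo-1.0.tar.gz
    # foo-1.0.sha256 holds the checksum the maintainer recorded
    sha256sum --check foo-1.0.sha256

If the tarball was tampered with in transit, sha256sum exits non-zero and the build aborts.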
Well, relying on checksums alone is not good enough in my opinion. We know for a fact that hashing algorithms are built on the *reasonable* assumption that collisions won't happen. Sure, the space of SHA256 hashes is huge, but we must not just shrug it off. Because if we, as maintainers, do, upstream will conclude they don't need to provide signed sources, because hashes are probably "good enough".
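For the record, providing signed sources would cost upstream a single command, and verifying them is just as short; a sketch with hypothetical file names:

    gpg --detach-sign foo-1.0.tar.gz                  # upstream publishes foo-1.0.tar.gz.sig
    gpg --verify foo-1.0.tar.gz.sig foo-1.0.tar.gz    # downloaders check it against upstream's key

Unlike a bare checksum, a valid signature also tells you *who* published the tarball, assuming you trust the upstream key.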
The signature itself is only a signed hash (SHA256), so we do rely on the collision resistance of SHA256 [1] (or whatever hash GPG itself uses). You are right that hashes themselves are not enough to verify that the original author provided the source. But they do guarantee that you downloaded the same source as the maintainer (the PKGBUILD writer) did. That is what integrity is all about; it is not only a checksum!

The weakest spot, though, is the initial fetch of the source, which the maintainer has to rely on. With strong hashes you can at least ensure that, for a rebuild, you download exactly the same sources as the maintainer did. You just cannot prove who published the source in the first place. Saying SHA256 is not secure enough for that purpose would amount to saying GPG is not safe either. Correct me if I am wrong, though.

It would also be nice to discuss this in the email thread I recently opened rather than in the TU application. I think this is a highly important topic, especially for those packages where we have neither GPG nor HTTPS available and you can only rely on the hash the maintainer handed out (AUR).

[1] https://www.bestvpn.com/wp-content/uploads/2015/03/Digital_Signature_diagram...

Cheers
Nico