[arch-dev-public] todo list for moving http -> https sources
Hi all,

There's been a sizeable number of bugs filed over the past month or so about changing PKGBUILDs to acquire sources from https rather than http. Rather than continue to flood the bug tracker, would anyone mind if I wrote a script to find instances of this and start a TODO list? This would, of course, be low priority. Even if no one does anything, we at least have a statement of work and can avoid having these "bugs" littered around flyspray.

Unless there's strong opposition to this (and I'd be very interested to know why), I'll polish up my automation and create the list.

d
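P.S. Roughly what I have in mind for the script, as a sketch only: it assumes a checkout of the PKGBUILD tree under ./packages (a hypothetical path) and does a crude textual scan rather than evaluating the source arrays.

    #!/bin/bash
    # List http:// sources in PKGBUILDs that also answer over https.
    shopt -s globstar nullglob
    for p in packages/**/PKGBUILD; do
        # Crude textual extraction; does not evaluate the source array.
        grep -o 'http://[^"'\''[:space:])]*' "$p" | while read -r url; do
            # Probe the same path over https with a HEAD request.
            if curl -sfIL --max-time 10 "https://${url#http://}" >/dev/null; then
                printf '%s: %s\n' "$p" "$url"
            fi
        done
    done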
On Sun, 2016-10-30 at 20:55 -0400, Dave Reisner wrote:
Hi all,
There's been a sizeable number of bugs filed over the past month or so about changing PKGBUILDs to acquire sources from https rather than http. Rather than continue to flood the bug tracker, would anyone mind if I wrote a script to find instances of this and start a TODO list? This would, of course, be low priority. Even if no one does anything, we at least have a statement of work and can avoid having these "bugs" littered around flyspray.
Unless there's strong opposition to this (and I'd be very interested to know why), I'll polish up my automation and create the list.
d
Hello,

The few BR that reached me also requested the addition of a .sig. Since I use a transparent http cache at home (2Mb/s bandwidth), so far I've only added the signature, and not the https, as it breaks the cache.

Apart from the confidentiality of the request, what's the point of forcing https?

Cheers,

--
Sébastien "Seblu" Luttringer
https://seblu.net | Twitter: @seblu42
GPG: 0x2072D77A
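P.S. For reference, the kind of change I've been making looks like this (project URL and key fingerprint are placeholders):

    # PKGBUILD excerpt: fetch the detached signature next to the tarball
    # so makepkg verifies it at build time.
    source=("http://example.org/releases/$pkgname-$pkgver.tar.gz"{,.sig})
    sha256sums=('...'    # real tarball checksum elided
                'SKIP')  # signature files are skipped by the checksum step
    validpgpkeys=('ABCDEF0123456789ABCDEF0123456789ABCDEF01')  # upstream key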
[2016-10-31 03:23:48 +0100] Sébastien Luttringer:
On Sun, 2016-10-30 at 20:55 -0400, Dave Reisner wrote:
There's been a sizeable number of bugs filed over the past month or so about changing PKGBUILDs to acquire sources from https rather than http. Rather than continue to flood the bug tracker, would anyone mind if I wrote a script to find instances of this and start a TODO list? This would, of course, be low priority. Even if no one does anything, we at least have a statement of work and can avoid having these "bugs" littered around flyspray.
Unless there's strong opposition to this (and I'd be very interested to know why), I'll polish up my automation and create the list.
The few BR that reached me also requested the addition of a .sig. Since I use a transparent http cache at home (2Mb/s bandwidth), so far I've only added the signature, and not the https, as it breaks the cache.
Apart from the confidentiality of the request, what's the point of forcing https?
I agree with Sébastien. We should encourage upstream to digitally sign their releases, and verify their authenticity in our PKGBUILDs.

Downloading releases over HTTPS gives a false sense of security: everybody knows the CA model is severely broken. In terms of security this simply does not compare with OpenPGP... In my view, switching our download links to HTTPS is nothing but an annoyance.

Cheers.

-- Gaetan
On Sun, Oct 30, 2016 at 04:43:04PM -1000, Gaetan Bisson wrote:
[2016-10-31 03:23:48 +0100] Sébastien Luttringer:
On Sun, 2016-10-30 at 20:55 -0400, Dave Reisner wrote:
There's been a sizeable number of bugs filed over the past month or so about changing PKGBUILDs to acquire sources from https rather than http. Rather than continue to flood the bug tracker, would anyone mind if I wrote a script to find instances of this and start a TODO list? This would, of course, be low priority. Even if no one does anything, we at least have a statement of work and can avoid having these "bugs" littered around flyspray.
Unless there's strong opposition to this (and I'd be very interested to know why), I'll polish up my automation and create the list.
The few BR that reached me also requested the addition of a .sig. Since I use a transparent http cache at home (2Mb/s bandwidth), so far I've only added the signature, and not the https, as it breaks the cache.
Apart from the confidentiality of the request, what's the point of forcing https?
I agree with Sébastien. We should encourage upstream to digitally sign their releases, and verify their authenticity in our PKGBUILDs.
Downloading releases over HTTPS gives a false sense of security: everybody knows the CA model is severely broken. In terms of security this simply does not compare with OpenPGP... In my view, switching our download links to HTTPS is nothing but an annoyance.
The CA model is broken. http clients have bugs. http servers have bugs. pgp has bugs. sovereign states might be snooping on connections. None of these are reasons to avoid an attempt at providing another layer of security. That's all TLS is and I'm not suggesting it's some panacea.

Asking every upstream to provide a PGP signature isn't a process which will scale, and some of them will likely not be interested in doing such a thing. If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?

d
Am 31.10.2016 um 15:05 schrieb Dave Reisner:
Asking every upstream to provide a PGP signature isn't a process which will scale,
I am against enforcing https for projects which provide signatures. As Sebastien pointed out, there are valid reasons against using https and it adds no benefit when using signatures.

However, I agree that asking every single author to provide signatures is likely infeasible.
and some of them will likely not be interested in doing such a thing.
Having no interest in signing your work is surely a bad sign. Maybe we should look into dropping such software where we can.
If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?
You can't. We could mirror the sources and sign them ourselves, but that would require that we actually audit the sources somehow.
On Mon, Oct 31, 2016 at 08:14:32PM +0100, Thomas Bächler wrote:
Am 31.10.2016 um 15:05 schrieb Dave Reisner:
Asking every upstream to provide a PGP signature isn't a process which will scale,
I am against enforcing https for projects which provide signatures. As Sebastien pointed out, there are valid reasons against using https and it adds no benefit when using signatures.
IMO, Sebastien didn't really provide any compelling evidence that switching to https would be an encumbrance -- rather, a minor inconvenience at worst.

Do you have other reasons to add? I'd be very interested to know why this is a problem. We already have a large number of sources fetched over https including several which include gpg signatures. Do you want to revert those to http? Why or why not?
However, I agree that asking every single author to provide signatures is likely infeasible.
and some of them will likely not be interested in doing such a thing.
Having no interest in signing your work is surely a bad sign. Maybe we should look into dropping such software where we can.
I don't really think you believe this...
If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?
You can't.
We could mirror the sources and sign them ourselves, but that would require that we actually audit the sources somehow.
This, too, does not scale, and might even constitute a breach of the software's license.
On Mon, Oct 31, 2016 at 03:33:42PM -0400, Dave Reisner wrote:
On Mon, Oct 31, 2016 at 08:14:32PM +0100, Thomas Bächler wrote:
Am 31.10.2016 um 15:05 schrieb Dave Reisner:
Asking every upstream to provide a PGP signature isn't a process which will scale,
I am against enforcing https for projects which provide signatures. As Sebastien pointed out, there are valid reasons against using https and it adds no benefit when using signatures.
IMO, Sebastien didn't really provide any compelling evidence that switching to https would be an encumbrance -- rather, a minor inconvenience at worst.
Do you have other reasons to add? I'd be very interested to know why this is a problem. We already have a large number of sources fetched over https including several which include gpg signatures. Do you want to revert those to http? Why or why not?
To put some ballpark numbers to this with some simple grep'ing over the PKGBUILD tree and my initial scripting work...

- We have 4539 sources fetched over https
- 193 of those 4539 sources also include a pgp signature
- 2169 more sources could be fetched over https instead of http
- 597 of those 2169 sources could include a https-fetched pgp signature
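(The counting itself is nothing fancy -- roughly the following, over a hypothetical ./packages checkout of the PKGBUILD tree, so treat the figures as ballpark. It's a textual scan and doesn't evaluate the source arrays:)

    shopt -s globstar nullglob
    grep -oh 'https://[^"'\''[:space:]]*' packages/**/PKGBUILD | wc -l  # https sources
    grep -oh 'http://[^"'\''[:space:]]*' packages/**/PKGBUILD | wc -l   # http-only sources
    # pgp signature companions among the https ones:
    grep -oh 'https://[^"'\''[:space:]]*\.sig' packages/**/PKGBUILD | wc -l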
However, I agree that asking every single author to provide signatures is likely infeasible.
and some of them will likely not be interested in doing such a thing.
Having no interest in signing your work is surely a bad sign. Maybe we should look into dropping such software where we can.
I don't really think you believe this...
If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?
You can't.
We could mirror the sources and sign them ourselves, but that would require that we actually audit the sources somehow.
This, too, does not scale, and might even constitute a breach of the software's license.
[2016-10-31 10:05:26 -0400] Dave Reisner:
On Sun, Oct 30, 2016 at 04:43:04PM -1000, Gaetan Bisson wrote:
I agree with Sébastien. We should encourage upstream to digitally sign their releases, and verify their authenticity in our PKGBUILDs.
Downloading releases over HTTPS gives a false sense of security: everybody knows the CA model is severely broken. In terms of security this simply does not compare with OpenPGP... In my view, switching our download links to HTTPS is nothing but an annoyance.
The CA model is broken. http clients have bugs. http servers have bugs. pgp has bugs. sovereign states might be snooping on connections. None of these are reasons to avoid an attempt at providing another layer of security. That's all TLS is and I'm not suggesting it's some panacea.
Asking every upstream to provide a PGP signature isn't a process which will scale, and some of them will likely not be interested in doing such a thing. If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?
All the nuances in my message were apparently lost on you...

I said OpenPGP provides a much higher degree of security than HTTPS, so that's what we should strive to use. Obviously, for cases where digital signatures aren't available, downloading sources over HTTPS is better than nothing. What I argued, however, is that it's not much better than nothing, so we shouldn't become complacent and trust sources just because they came over TLS.

Cheers.

-- Gaetan
On Mon, Oct 31, 2016 at 04:09:40PM -1000, Gaetan Bisson wrote:
[2016-10-31 10:05:26 -0400] Dave Reisner:
On Sun, Oct 30, 2016 at 04:43:04PM -1000, Gaetan Bisson wrote:
I agree with Sébastien. We should encourage upstream to digitally sign their releases, and verify their authenticity in our PKGBUILDs.
Downloading releases over HTTPS gives a false sense of security: everybody knows the CA model is severely broken. In terms of security this simply does not compare with OpenPGP... In my view, switching our download links to HTTPS is nothing but an annoyance.
The CA model is broken. http clients have bugs. http servers have bugs. pgp has bugs. sovereign states might be snooping on connections. None of these are reasons to avoid an attempt at providing another layer of security. That's all TLS is and I'm not suggesting it's some panacea.
Asking every upstream to provide a PGP signature isn't a process which will scale, and some of them will likely not be interested in doing such a thing. If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?
All the nuances in my message were apparently lost on you...
I said OpenPGP provides a much higher degree of security than HTTPS, so that's what we should strive to use. Obviously, for cases where digital signatures aren't available, downloading sources over HTTPS is better than nothing. What I argued, however, is that it's not much better than nothing, so we shouldn't become complacent and trust sources just because they came over TLS.
Cheers.
-- Gaetan
I'll take this to mean that you don't have any objections to adding additional layers of security.

d
[2016-11-01 09:55:11 -0400] Dave Reisner:
On Mon, Oct 31, 2016 at 04:09:40PM -1000, Gaetan Bisson wrote:
[2016-10-31 10:05:26 -0400] Dave Reisner:
On Sun, Oct 30, 2016 at 04:43:04PM -1000, Gaetan Bisson wrote:
I agree with Sébastien. We should encourage upstream to digitally sign their releases, and verify their authenticity in our PKGBUILDs.
Downloading releases over HTTPS gives a false sense of security: everybody knows the CA model is severely broken. In terms of security this simply does not compare with OpenPGP... In my view, switching our download links to HTTPS is nothing but an annoyance.
The CA model is broken. http clients have bugs. http servers have bugs. pgp has bugs. sovereign states might be snooping on connections. None of these are reasons to avoid an attempt at providing another layer of security. That's all TLS is and I'm not suggesting it's some panacea.
Asking every upstream to provide a PGP signature isn't a process which will scale, and some of them will likely not be interested in doing such a thing. If an upstream won't provide PGP signatures, do you have another suggestion as to how we can secure our process of obtaining upstream sources in a reliable manner?
All the nuances in my message were apparently lost on you...
I said OpenPGP provides a much higher degree of security than HTTPS, so that's what we should strive to use. Obviously, for cases where digital signatures aren't available, downloading sources over HTTPS is better than nothing. What I argued, however, is that it's not much better than nothing, so we shouldn't become complacent and trust sources just because they came over TLS.
I'll take this to mean that you don't have any objections to adding additional layers of security.
My point is they're not "additional layers of security", just "additional layers". But whatever, if you feel that strongly about this, go right ahead.

-- Gaetan
On Mon, Oct 31, 2016 at 03:23:48AM +0100, Sébastien Luttringer wrote:
On Sun, 2016-10-30 at 20:55 -0400, Dave Reisner wrote:
Hi all,
There's been a sizeable number of bugs filed over the past month or so about changing PKGBUILDs to acquire sources from https rather than http. Rather than continue to flood the bug tracker, would anyone mind if I wrote a script to find instances of this and start a TODO list? This would, of course, be low priority. Even if no one does anything, we at least have a statement of work and can avoid having these "bugs" littered around flyspray.
Unless there's strong opposition to this (and I'd be very interested to know why), I'll polish up my automation and create the list.
d
Hello,
The few BR that reached me also requested the addition of a .sig.
Yes, this was raised on IRC as well. I'm going to do this in a separate pass.
Since I use a transparent http cache at home (2Mb/s bandwidth), so far I've only added the signature, and not the https, as it breaks the cache.
This doesn't seem to hold much weight. You're duplicating the source tarball now, as it exists (on disk?) in your http cache and in makepkg's SRCDEST. I'm not sure I see the benefit to doing this, particularly since the caching in SRCDEST is entirely agnostic to the protocol used to fetch it.
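(For anyone not using it: SRCDEST is a single knob in makepkg.conf, and the cache it gives you is filled the same way no matter the protocol. The path below is just an example:)

    # /etc/makepkg.conf (or ~/.makepkg.conf): cache downloaded sources here.
    # makepkg reuses a file already present in SRCDEST instead of fetching
    # it again, whether the URL is http or https.
    SRCDEST=/home/build/sources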
Apart from the confidentiality of the request, what's the point of forcing https?
Security of sources, particularly those which we obtain without any upstream verification mechanism such as a checksum or PGP signature. Even for those with signatures or checksums, you must consider that security is not a binary thing, and is always approached in layers.

d
On Sun, 2016-10-30 at 22:47 -0400, Dave Reisner wrote:
On Mon, Oct 31, 2016 at 03:23:48AM +0100, Sébastien Luttringer wrote:
Since I use a transparent http cache at home (2Mb/s bandwidth), so far I've only added the signature, and not the https, as it breaks the cache.
This doesn't seem to hold much weight. You're duplicating the source tarball now, as it exists (on disk?) in your http cache and in makepkg's SRCDEST. I'm not sure I see the benefit to doing this, particularly since the caching in SRCDEST is entirely agnostic to the protocol used to fetch it.
Over time, I found a problem using $SRCDEST: it doesn't check whether the upstream sources have been modified since. I've been tricked a few times, releasing packages with my local tarballs and not the ones available to others. Maybe it's something which can be improved directly in makepkg.

My point was to explain why I haven't already switched to https on all my packages, and to see whether our rules leave a place for letting this happen, or whether we want to enforce it in all cases. This, in some way, reduces the maintainer's freedom to select which sources he prefers. I didn't bring that up as a showstopper for improving source transport security.
Apart from the confidentiality of the request, what's the point of forcing https?
Security of sources, particularly those which we obtain without any upstream verification mechanism such as a checksum or PGP signature. Even for those with signatures or checksums, you must consider that security is not a binary thing, and is always approached in layers.
I understand security is not binary. TLS is about the security of the transportation of the sources, not the security of the sources themselves; that's why I asked, to know what you had in mind.

My definition of securing the sources is a way to trust the sources at build time, no matter the way they were fetched. I want to be sure that my sources are "correct" even if I get them by usb key, ftp or rsync, and that they were not corrupted locally by a btrfs bug. And when possible, I want to be sure that the server (mirror or not) was not compromised (even at the first fetch).

Keeping that in mind, enforcing tls doesn't improve the security of the sources much. In fact, it only improves security during the transportation of the sources, at the cost of the caching. So, even though I am a partisan of tls everywhere, I am still torn because of the caching.

Cheers,

--
Sébastien "Seblu" Luttringer
https://seblu.net | Twitter: @seblu42
GPG: 0x2072D77A
On 01/11, Sébastien Luttringer wrote:
On Sun, 2016-10-30 at 22:47 -0400, Dave Reisner wrote:
On Mon, Oct 31, 2016 at 03:23:48AM +0100, Sébastien Luttringer wrote:
Since I use a transparent http cache at home (2Mb/s bandwidth), so far I've only added the signature, and not the https, as it breaks the cache.
This doesn't seem to hold much weight. You're duplicating the source tarball now, as it exists (on disk?) in your http cache and in makepkg's SRCDEST. I'm not sure I see the benefit to doing this, particularly since the caching in SRCDEST is entirely agnostic to the protocol used to fetch it.
Over time, I found a problem using $SRCDEST: it doesn't check whether the upstream sources have been modified since. I've been tricked a few times, releasing packages with my local tarballs and not the ones available to others. Maybe it's something which can be improved directly in makepkg.
Mmm, it probably should check if it's been modified, and if so, complain loudly.

--
Sincerely,
Johannes Löthberg
PGP Key ID: 0x50FB9B273A9D0BB5
https://theos.kyriasis.com/~kyrias/
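P.S. Something along these lines, as a standalone sketch rather than actual makepkg internals (the URL is the only argument, and error handling is kept minimal):

    #!/bin/bash
    # Compare a cached tarball in $SRCDEST against upstream and complain
    # loudly if they differ.
    url=$1
    file=$SRCDEST/${url##*/}
    tmp=$(mktemp) || exit 1
    if curl -sfLo "$tmp" "$url" && ! cmp -s "$tmp" "$file"; then
        echo "WARNING: cached $file differs from upstream copy" >&2
    fi
    rm -f "$tmp"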
participants (5)

- Dave Reisner
- Gaetan Bisson
- Johannes Löthberg
- Sébastien Luttringer
- Thomas Bächler