
gpg.fail

450 points · 1 month ago · gpg.fail

oefrha 1 month ago

Okay, since there’s so much stuff to digest here and apparently there are issues designated as wontfix by GnuPG maintainers, can someone more in the loop tell us whether using gpg signatures on git commits/tags is vulnerable? And is there any better alternative going forward? Like is signing with SSH keys considered more secure now? I certainly want to get rid of gpg from my life if I can, but I also need to make sure commits/tags bearing my name actually come from me.

tptacek 1 month ago

One of those WONTFIXes is on an insane vulnerability: you can bitflip known plaintext in a PGP message to switch it into compression handling, allowing attackers to instruct GnuPG packet processing to look back to arbitrary positions in the message, all while suppressing the authentication failure message. GPG's position was: they print, in those circumstances, an error of some sort, and that's enough. It's an attack that reveals plaintext bytes!
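
For intuition (this is not the exact GnuPG code path, and the real attack additionally has to work inside a CFB-encrypted SEIP packet): in a stream-style mode, an attacker who knows a plaintext byte can rewrite the matching ciphertext byte so it decrypts to any chosen value, e.g. 0xC8, the new-format header octet for an OpenPGP compressed-data packet (tag 8). A toy sketch with a made-up keystream:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy deterministic keystream -- stands in for a real cipher."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"\xcb" + b"literal data..."        # victim's real message
ct = xor(plaintext, keystream(b"secret", len(plaintext)))

# Attacker knows plaintext[0]; without knowing the key, rewrite it to
# 0xC8 so the decryptor sees a compressed-data packet header instead:
forged = bytes([ct[0] ^ plaintext[0] ^ 0xC8]) + ct[1:]

decrypted = xor(forged, keystream(b"secret", len(forged)))
assert decrypted[0] == 0xC8    # parser now dispatches to DEFLATE handling
```

The rest of the message decrypts unchanged, which is exactly why integrity checks (and hard failures on their violation) matter.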

upofadown 1 month ago

Are you referring to "Encrypted message malleability checks are incorrectly enforced causing plaintext recovery attacks"?

Seems like a legitimate difference of opinion. The researcher wants a message with an invalid format to return an integrity failure message. Presumably the GnuPG project thinks that would be better handled by some sort of bad-format error.

The exploit here is a variation on the age-old idea of tricking a PGP user into decrypting an encrypted message and then sending the result to the attacker. The novelty here is the idea of making the encrypted message look like a PGP key (identity) and then asking the victim to decrypt the fake key, sign it, and upload it to a keyserver.

Modifying a PGP message file will break the normal PGP authentication[1] (that was not acknowledged in the attack description). So here is the exploit:

* The victim receives an unauthenticated/anonymous (unsigned or with a broken signature) message from the attacker. The message looks like a public key.

* Somehow (perhaps in another anonymous message) the attacker claims they are someone the victim knows and asks them to decrypt, sign and upload the signed public key to a keyserver.

* The victim sees nothing wrong with any of this and actually does what the attacker wants, ignoring the error message about the bad message format.

So this attack is also quite unlikely. Possibly that affected the GnuPG project's decision not to change behaviour in this case, particularly since such a change could itself introduce other vulnerabilities.

[1] https://articles.59.ca/doku.php?id=pgpfan:pgpauth

Added: Wait. How would the victim import the bogus PGP key into GPG so they could sign it? There would normally be a preexisting key for that user so the bogus key would for sure fail to import. It would probably fail anyway. It will be interesting to see what the GnuPG project said about this in their response.

tptacek 1 month ago

In the course of this attack, just in terms of what happens in the mechanics of the actual protocol, irrespective of the scenario in which these capabilities are abused, the attacker:

(1) Rewrites the ciphertext of a PGP message

(2) Introducing an entirely new PGP packet

(3) That flips GPG into DEFLATE compression handling

(4) And then reroutes the handling of the subsequent real message

(5) Into something parsed as a plaintext comment

This happens without a security message, but rather just (apparently) a zlib error.

In the scenario presented at CCC, they used the keyserver example to demonstrate plaintext exfiltration. I kind of don't care. It's what's happening under the hood that's batshit; the "difference of opinion" is that the GnuPG maintainers (and, I guess, you) think this is an acceptable end state for an encryption tool.
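
The reason one rewritten byte can reroute everything is that which handler GnuPG invokes is decided by a single attacker-controlled header octet. A minimal sketch of OpenPGP packet-tag parsing (per RFC 4880 §4.2; real parsing also handles length fields, omitted here):

```python
def packet_tag(header: bytes) -> int:
    """Extract the packet tag from the first octet of an OpenPGP
    packet header (RFC 4880, section 4.2)."""
    octet = header[0]
    if not octet & 0x80:
        raise ValueError("bit 7 must be set in a packet header")
    if octet & 0x40:                  # new-format header
        return octet & 0x3F
    return (octet >> 2) & 0x0F        # old-format: tag in bits 5..2

assert packet_tag(b"\xcb") == 11   # literal data
assert packet_tag(b"\xc8") == 8    # compressed data -> DEFLATE path
assert packet_tag(b"\x84") == 1    # old-format public-key-encrypted session key
```

Flipping 0xCB to 0xC8 is the whole difference between "print this literal data" and "feed everything that follows to the decompressor".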

akulbe 1 month ago

Is there a better alternative to GPG?

tptacek 1 month ago

Everything is better than PGP (not just GPG --- all PGP implementations).

The problem with PGP is that it's a Swiss Army Knife. It does too many things. The scissors on a Swiss Army Knife are useful in a pinch if you don't have real scissors, but tailors use real scissors.

Whatever it is you're trying to do with encryption, you should use the real tool designed for that task. Different tasks want altogether different cryptosystems with different tradeoffs. There's no one perfect multitasking tool.

When you look at the problem that way, surprisingly few real-world problems ask for "encrypt a file". People need backup, but backup demands backup cryptosystems, which do much more than just encrypt individual files. People need messaging, but messaging is wildly more complicated than file encryption. And of course people want package signatures, ironically PGP's most mainstream usage; ironic because it relies on only a tiny fraction of PGP's functionality and still somehow doesn't work.

All that is before you get to the absolutely deranged 1990s design of PGP, which is a complex state machine that switches between different modes of operation based on attacker-controlled records (which are mostly invisible to users). Nothing modern looks like PGP, because PGP's underlying design predates modern cryptography. It survives only because nerds have a parasocial relationship with it.

coppsilgold 1 month ago

Depending on what you are after, an alternative could be using SSH keys for signatures and age[1] for encryption targeting SSH keys.

[1] <https://github.com/FiloSottile/age>
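
For the curious, age can encrypt directly to existing SSH public keys, and OpenSSH itself can sign and verify with the same keypair. A sketch (file names and the default key path are placeholders, adjust as needed):

```shell
# Encrypt a file to the holder of an SSH key
# (age accepts ssh-ed25519 / ssh-rsa public keys as recipients)
age -R ~/.ssh/id_ed25519.pub -o secrets.txt.age secrets.txt

# Decrypt with the corresponding private key
age -d -i ~/.ssh/id_ed25519 -o secrets.txt secrets.txt.age

# Detached signing/verification with the same key via OpenSSH
ssh-keygen -Y sign -f ~/.ssh/id_ed25519 -n file secrets.txt
ssh-keygen -Y check-novalidate -n file -s secrets.txt.sig < secrets.txt
```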

baobun 1 month ago

sq (sequoia) is compatible and is available in your favorite distro. It's the recommended replacement.

https://book.sequoia-pgp.org/about_sequoia.html

zimmerfrei 1 month ago

This is the right answer.

The problem mostly concerns the oldest parts of PGP (the protocol), which gpg (the implementation) doesn't want to, or cannot, get rid of.

vbezhenar 1 month ago

age

alphazard 1 month ago

It's a fundamentally bad idea to have a single key that applications are supposed to look for in a particular place, and then use to sign things. There is inherent complexity involved in making multi-context key use safe, and it's better to just avoid it architecturally.

Keys (even quantum safe) are small enough that having one per application is not a problem at all. If an application needs multi-context, they can handle it themselves. If they do it badly, the damage is contained to that application. If someone really wants to make an application that just signs keys for other applications to say "this is John Smith's key for git" and "this is John Smith's key for email" then they could do that. Such an application would not need to concern itself with permissions for other applications calling into it. The user could just copy and paste public keys, or fingerprints when they want to attest to their identity in a specific application.

The keyring circus (which is how GPG most commonly intrudes into my life) is crazy too. All these applications insist on connecting to some kind of GPG keyring instead of just writing the secrets to the filesystem in their own local storage. The disk is fully encrypted, and applications should be isolated from one another. Nothing is really being accomplished by requiring the complexity of yet another program to "extra encrypt" things before writing them to disk.

I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.

akerl_ 1 month ago

> The disk is fully encrypted, and applications should be isolated from one another.

For most apps on non-mobile devices, there isn't filesystem isolation between apps. Disk/device-level encryption solves for a totally different threat model; Apple/Microsoft/Google all ship encrypted storage for secrets (Keychain, Credential Manager, etc), because restricting key material access within the OS has merit.

> I'm sure these bad ideas come from the busy work invented in corporate "security" circles, which invent complexity to keep people employed without any regard for an actual threat model.

Basically everything in PGP/GPG predates the existence of "corporate security circles".

Avamander 1 month ago

> For most apps on non-mobile devices, there isn't filesystem isolation between apps.

If there isn't there should be. At least my Flatpaks are isolated from each other.

> Apple/Microsoft/Google all ship encrypted storage for secrets (Keychain, Credential Manager, etc), because restricting key material access within the OS has merit.

The Linux equivalents are suspicious and stuck in the past to say the least. Depending on them is extra tedious on top of the tediousness of any PGP keyrings, god forbid a combination of the two.

> Basically everything in PGP/GPG predates the existence of "corporate security circles".

Then we know where this stuff came from.

deknos 1 month ago

and now certain people in corporate security only trust gpg, because they grew up with it :D

xorcist 1 month ago

These are not vulnerabilities in the "remote exploit" sense. They should be taken seriously, you should be careful not to run local software on untrusted data, and GPG should probably do more to protect users from shooting themselves in the foot, but the worst thing you could do is panic and throw out a process your partners and colleagues trust. There is nothing here that will disturb your workflow signing commits or apt-get install-ing from your distribution.

If you use cryptographic command line tools to verify data sent to you, be mindful of what you are doing and make sure to understand the attacks presented here. One of the slides is titled "should we even use command line tools" and yes, we should, because the alternative is worse, but we must be diligent in treating all untrusted data as adversarial.

akerl_ 1 month ago

A huge part of GPG’s purported use case is getting a signed/encrypted/both blob from somebody and using GPG to confirm it’s authentic. This is true for packages you download and for commits with signatures.

Handling untrusted input is core to that.

xorcist 1 month ago

It is, and other software handling untrusted data should also treat it as adversarial. For example, your package tool should probably not output raw package metadata to the terminal.
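
A concrete version of "don't output raw metadata": strip terminal escape sequences and control characters from untrusted strings before echoing them. A minimal sketch (the regex covers common ANSI CSI sequences, not every escape family):

```python
import re

# Match ANSI CSI escape sequences, or any other C0 control character
# except tab/newline, in untrusted text destined for a terminal.
CONTROL = re.compile(r"\x1b\[[0-9;]*[A-Za-z]|[\x00-\x08\x0b-\x1f\x7f]")

def sanitize(untrusted: str) -> str:
    """Remove terminal-control bytes from untrusted metadata."""
    return CONTROL.sub("", untrusted)

assert sanitize("name\x1b[2Jevil") == "nameevil"   # clear-screen escape stripped
assert sanitize("ok\nline") == "ok\nline"          # ordinary newlines kept
```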

akerl_ 1 month ago

I think you’re missing the forest for the trees.

tgsovlerkhgsel 1 month ago

It reads to me like attempting to verify a malicious ASCII-armoured signature is a potential RCE.

larusso 1 month ago

I did the switch this year after getting yet another personal computer. I have 4 in total (work laptop, personal sofa laptop, Mac Mini, Linux tower). I used YubiKeys with gpg and resident SSH keys. All was fine, except for the configuration needed to get it to work on all the machines. I also tend to forget the finer details and have to relearn the skills of fetching the public keys into the keychain etc. I got rid of all this by moving to the 1Password SSH agent and git SSH signing. Removes a lot of headaches from my SSH setup. I still have the YubiKey(s), though, as a 2nd factor for certain web services. And the gpg agent is still running, but only as a fallback. I will turn this off next year.

snorremd 1 month ago

I’ve ended up the same place as you. I had previously set up my gpg key on a Yubikey and even used that gpg key to handle ssh authentication. Then at some point it just stopped working, maybe the hardware on my key broke. 2FA still works though.

In any case I figured storing an SSH key in 1Password and using the integrated SSH socket server with my ssh client and git was pretty nice and secure enough. The fact the private key never leaves the 1Password vault unencrypted and is synced between my devices is pretty neat. From a security standpoint it is indeed a step down from having my key on a physical key device, but the hassle of setting up a new Yubikey was not quite worth it.

I’m sure 1Password is not much better than having a passphrase-protected key on disk. But it’s a lot more convenient.

DetectDefect 1 month ago

> I had previously set up my gpg key on a Yubikey and even used that gpg key to handle ssh authentication. Then at some point it just stopped working, maybe the hardware on my key broke

Did you try to SSH in verbose mode to ascertain any errors? Why did you assume the hardware "broke" without any objective qualification of an actual failure condition?

> I figured storing an SSH key in 1Password and using the integrated SSH socket server with my ssh client and git was pretty nice and secure enough

How is trusting a closed-source, for-profit, subscription-based application with your SSH credential "secure enough"?

Choosing convenience over security is certainly not unreasonable, but claiming both are achieved without any compromise borders on ludicrous.

hirako2000 1 month ago

How is 1password safer than the local keychain?

larusso 1 month ago

The keys never leave the 1Password store, so you don't have the keys on the local file system. That, and the fact that the keys are synced over the cloud, was the selling point for me. I guess security-wise it's a bit of a downgrade compared to resident keys. But the agent supports agent forwarding etc., which wasn't really working with YubiKey SSH resident keys. Also worth mentioning that I use 1Password; Bitwarden has a similar feature as far as I know. For those who want to self-host, that might be the even better solution.

akerl_ 1 month ago

> The keys never leave the 1Password store. So you don’t have the keys on the local file system.

Keychain and 1Password are doing variants of the same thing here: both store an encrypted vault and then give you credentials by decrypting the contents of that vault.

hk1337 1 month ago

> 1Password ssh agent and git ssh signing

I’m still working through how to use this, but I have it basically set up and it’s great!
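
For anyone else setting this up, the git side is just a few config switches (the key path and email are placeholders; a 1Password-style agent slots in through the normal ssh-agent socket):

```shell
# Tell git to sign with SSH instead of OpenPGP
git config --global gpg.format ssh
git config --global user.signingkey ~/.ssh/id_ed25519.pub
git config --global commit.gpgsign true

# Verification needs an allowed-signers list mapping identities to keys
echo "you@example.com $(cat ~/.ssh/id_ed25519.pub)" > ~/.ssh/allowed_signers
git config --global gpg.ssh.allowedSignersFile ~/.ssh/allowed_signers
```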

65a 1 month ago

> I certainly want to get rid of gpg from my life if I can

I see this sentiment a lot, but you later hint at the problem. Any "replacement" needs to solve for secure key distribution. Signing isn't hard, you can use a lot of different things other than gpg to sign something with a key securely. If that part of gpg is broken, it's a bug, it can/should be fixed.

The real challenge is distributing the key so someone else can verify the signature, and almost every way to do that is fundamentally flawed, introduces a risk of operational errors or is annoying (web of trust, trust on first use, central authority, in-person, etc). I'm not convinced the right answer here is "invent a new one and the ecosystem around it".

akerl_ 1 month ago

It's not like GPG solves for secure key distribution. GPG keyservers are a mess, and you can't trust their contents anyways unless you have an out of band way to validate the public key. Basically nobody is using web-of-trust for this in the way that GPG envisioned.

This is why basically every modern usage of GPG either doesn't rely on key distribution (because you already know what key you want to trust via a pre-established channel) or devolves to the other party serving up their pubkey over HTTPS on their website.

65a 1 month ago

Yes, not saying that web of trust ever worked. "Pre-established channels" are the other mechanisms I mentioned, like a central authority (HTTPS) or TOFU (just trust the first key you get). All of these have some issues that any alternative must also solve for.

akerl_ 1 month ago

So if we need a pre-established channel anyways, why would people recommending a replacement for GPG workflows need to solve for secure key distribution?

This is a bit like looking at electric cars and saying ~"well you can't claim to be a viable replacement for gas cars until you can solve flight"

woodruffw 1 month ago

A lot of people are using PGP for things that don’t require any kind of key distribution. If you’re just using it to encrypt files (even between pointwise parties), you can probably just switch to age.

(We’re also long past the point where key distribution has been a significant component of the PGP ecosystem. The PGP web of trust and original key servers have been dead and buried for years.)

kaoD 1 month ago

This is not the first time I see "secure key distribution" mentioned in HN+(GPG alternatives) context and I'm a bit puzzled.

What do you mean? Web of Trust? Keyservers? A combination of both? Under what use case?

kpil 1 month ago

I'm assuming they mean the old way of signing each other's keys.

As a practical implementation of "six degrees of Kevin Bacon", you could get an organic trust chain to random people.

Or at least, more realistically, to a few nerds. I think I signed 3-4 people's keys.

The process had - as they say - a low WAF.

65a 1 month ago

In a signature context, you probably want someone else to know that "you" signed it (I can think of other cases, but that's the usual one). The way to do that requires them to know that the key which signed the data belongs to you. My only point is that this is actually the hard part, which any "replacement" crypto system needs to solve for, and that solving that is hard (none of the methods are particularly good).

Avamander 1 month ago

> The way to do that requires them to know that the key which signed the data belongs to you.

This is something S/MIME does and I wouldn't say it doesn't do so well. You can start from mailbox validation and that already beats everything PGP has to offer in terms of ownership validation. If you do identity validation or it's a national PKI issuing the certificate (like in some countries) it's a very strong guarantee of ownership. Coughing baby (PGP) vs hydrogen bomb level of difference.

It sounds to me much more like an excuse to use PGP when it doesn't even remotely offer what you want from a replacement.

afiori 1 month ago

I think it should be mostly ad-hoc methods:

if you have a website put your keys in a dedicated page and direct people there

If you are in an org there can be whatever kind of centralised repo

Add the hashes to your email signature and/or profile bios

There might be a nice uniform solution using DNS and derived keys like certificate chains? I am not sure but I think it might not be necessary

antonvs 1 month ago

I haven't gone through the list in detail, but I don't see anything there that implies the ability to forge a valid signature without the private key, which is what matters most for git commits.

Most of the entries have to do with ways to compromise the unencrypted text presented to the user, so that the displayed message doesn't match the signed message. This allows for multiple different kinds of exploit.

But in the git commit case the main thing we care about, for commits authored by anyone whose signature we trust, is that the actual commit matches the signature, and git itself enforces that.

Of course, it's possible that a malicious user could construct a commit that expands to something misleading (with or without GPG). But that comes back to the point of signatures in the first place - if your repo allows random anonymous people to push signed commits, then you might have an issue.
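
Checking that property is mechanical once you've pinned which keys you trust; e.g. (tag name is a placeholder):

```shell
# Verify signatures on commits/tags; the signing keys must already be
# trusted locally (imported into gpg, or listed in allowed_signers for SSH)
git verify-commit HEAD
git verify-tag v1.0.0
git log --show-signature -1
```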

jwr 1 month ago

It has become fashionable to s*t on GnuPG. I just wish all the crypto experts doing that would point me to an alternative that is functionally equivalent.

Something that will encrypt using AES-256 with a passphrase, but also using asymmetric crypto. Oh, and I want my secret keys printable if needed. And I want to store them securely on YubiKeys once generated (https://github.com/drduh/YubiKey-Guide). I want to be able to encrypt my backups to multiple recipients. And I want the same keys (stored on Yubikeys, remember?) to be usable for SSH authentication, too.

And by the way, if your fancy tool is written using the latest language du jour with a runtime that changes every couple of years or so, or requires huge piles of dependencies that break if you even as much as sneeze (python, anyone?), it won't do.

BTW, in case someone says "age", I actually followed that advice and set it up just to be there on my systems (managed by ansible). Apart from the fact that it really slowed down my deployments, the thing broke within a year. And I didn't even use it. I just wanted to see how reliable it will be in the most minimal of ways: by having it auto-installed on my systems.

If your fancy tool has less than 5 years of proven maintenance record, it won't do. Encryption is for the long term. I want to be able to read my stuff in 15-30 years.

So before you go all criticizing GnuPG, please understand that there are reasons why people still use it, and are actually OK with the flaws described.

woodruffw 1 month ago

> I just wish all the crypto experts doing that would point me to an alternative that is functionally equivalent.

The entire point of every single valid criticism of PGP is that you cannot make a single functionally equivalent alternative to PGP. You must use individual tools that are good at specific things, because the "Swiss Army knife" approach to cryptographic tool design has yielded empirically poor outcomes.

If you have an example of how age broke for you, I think its maintainers would be very interested in hearing that -- I've been using it directly and indirectly for 5+ years and haven't had any compatibility or runtime issues with it, including when sharing encrypted files across different implementations of age.

tptacek 1 month ago

Point of order: there are valid and important criticisms of PGP that have nothing to do with its jack-of-all-trades philosophy. There's no modern cryptosystem in the world you would design with PGP's packet scheme.

woodruffw 1 month ago

Yeah, that was just the low-hanging fruit I reached for.

(I think you can make a tie-in argument here, though: PGP's packet design and the state machine that falls out of it is a knock-on effect of how many things PGP tries to do. PGP would maybe not have such a ridiculously complicated packet design if it didn't try and do so many things.)

some_furry 1 month ago

> Apart from the fact that it really slowed down my deployments, the thing broke within a year. And I didn't even use it. I just wanted to see how reliable it will be in the most minimal of ways: by having it auto-installed on my systems.

I'm very curious about this. Tell me more.

tptacek 1 month ago

I didn't even catch this the first read. `age` is a command line program written in Go. It's not a system service. Simply "having it installed" on your system can't do anything.

fn-mote 1 month ago

If it fails to build when the system is updated?

Poster says:

> slowed down my deployments

I take that to mean the _deployment_ step, not the deployed system.

technion 1 month ago

There is a downloadable binary, I doubt many people recommending age are recommending every server using it also download a Go compiler and build it themselves.

jwr 1 month ago

What I meant was that the ansible recipe for building and installing age broke within a year. I didn't investigate why, I just switched it off, but it was a data point.

Yes, I know this can surely be explained and isn't a "fair" comparison. But then again my time is limited and I need predictable, reliable tools.

dwattttt 1 month ago

> Apart from the fact that it really slowed down my deployments

Is that complaint comparable enough to be worth mentioning? And if it is, are you sure you actually need cryptography? It slowed things down a bit, so you don't really want to move on from a GnuPG that is demonstrably too complex not to have bugs?

Natanael_L 1 month ago

Asking for an equivalent to GPG is like asking for an equivalent of a Swiss knife with unshielded chainsaws and laser cutters.

Stop asking for it, for your own good, please. If you don't understand the entire spec you can't use it safely.

You want special purpose tools. Signal for communication, Age for safer file encryption, etc.

What exact problems did you have with age? You're not explaining how it broke anything. Are you compiling it yourself? age has YubiKey support and can do all you described.

> if your fancy tool has less than 5 years of proven maintenance record, it won't do. Encryption is for the long term. I want to be able to read my stuff in 15-30 years.

This applies to algorithms; it does not apply to cryptographic software in the same way. The state of the art changes fast, and while algorithms tend to stand for a long time these days, there are significant changes in protocol designs and attack methods.

Downgrade protection, malleability protection, sidechannel protection, disambiguation, context binding, etc...

You want software to be implemented by experts using known best practices with good algorithms and audited by other experts.

baobun 1 month ago

If you haven't checked out sequoia (sq), you should! I think it ticks your boxes.

https://book.sequoia-pgp.org/about_sequoia.html

cykros 1 month ago

There's something to be said perhaps for preferring tools that do one of those things, rather than all of those things, and doing them well.

Not to say you can't then make an umbrella interface for interacting with them all as a suite, but perhaps the issue has become that gpg has not appropriately followed the Unix philosophy to begin with.

Not that I've got the solution for you. Just calling out the nature of your demands somewhat being at odds with a key design principle that made Unix and Unix-likes great to begin with.

tptacek 1 month ago

There isn't an alternative that is functionally equivalent because what PGP does is dumb. It's a Swiss Army Knife. Nobody who wants to design an excellent saw sets out to design the Swiss Army Knife saw[†]. Nobody who needs shears professionally buys a Swiss Army Knife for the scissors.

The cryptographic requirements of different problems --- backup, package signing, god-help-us secure messaging --- are in tension with each other. No one design adequately covers all the use cases. Trying to cram them all into one tool is a sign that something other than security is the goal. If that's the case, you're live action roleplaying, not protecting people.

I'd be interested in whether you could find a cryptographer who disagrees with that. I've asked around!

[†] I am aware that SAK nerds love the saw.

picture 1 month ago

So what toolbag or workshop of excellent specialized tools would provide the same capability as GnuPG?

tptacek 1 month ago

Ask me a question about a specific realistic problem (ie, not "how do I replicate this behavior of PGP", but rather "how do I solve this real-world problem") and I'll give an answer (or someone else will).

jmclnx 1 month ago

I believe all the criticism of GnuPG is due to the fact that most people grew up with Microsoft or Apple, so they are used to hand-holding.

If you read the various how-tos out there it is not that hard to use, just people do not want to read anything more than 2 lines. That is the main issue.

My only complaint is that Thunderbird now uses its own homegrown encryption, thus locking you into their email client. Almost all email clients seem to have their own way of doing encryption, confusing matters even more. I now use mutt because it can be easily linked to GnuPG and it does not lock me into a specific client.

woodruffw 1 month ago

> If you read the various how-tos out there it is not that hard to use, just people do not want to read anything more than 2 lines. That is the main issue.

The video linked above contains multiple examples of people using GnuPG's CLI in ways that it was seemingly intended to be used. Blaming users for holding it wrong seems facile.

rurban 1 month ago

Zero-days from the CCC talk https://fahrplan.events.ccc.de/congress/2025/fahrplan/event/...

But trust in Werner Koch is gone. Wontfix??

corndoge 1 month ago

I am curious what you mean by "trust in Werner Koch is gone". Can you elaborate?

karambahh 1 month ago

OP is complaining about the GnuPG team rejecting issues with "wontfix" statuses.

rawmaterial 1 month ago

'Team' is Werner.

cpach 1 month ago

To be frank, at this point, GPG has been a lost cause for basically decades.

People who are serious about security use newer, better tools that replace GPG. But keep in mind, there’s no “one ring to rule them all”.

perching_aix 1 month ago

What are those better tools? I've been broadly looking into this space, but never ventured too deep.

singpolyma3 1 month ago

Sequoia, for example, has been doing a great job and implements the latest version of the standard, which brings a lot of the cryptography up to date.

arccy 1 month ago

ssh or minisign for signing; age for file encryption
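
Both are single-purpose tools; typical usage looks something like this (file names are placeholders):

```shell
# minisign: generate a keypair, sign, verify
minisign -G                            # writes minisign.key / minisign.pub
minisign -S -m release.tar.gz          # produces release.tar.gz.minisig
minisign -V -m release.tar.gz -p minisign.pub

# age: symmetric (passphrase) or recipient-based file encryption
age -p -o notes.txt.age notes.txt
age -d -o notes.txt notes.txt.age
```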

ghickPit 1 month ago

> To be frank, at this point, GPG has been a lost cause for basically decades.

Why do high-profile projects, such as Linux and QEMU, still use GPG for signing pull requests / tags?

https://docs.kernel.org/process/maintainer-pgp-guide.html

https://www.qemu.org/docs/master/devel/submitting-a-pull-req...

Why does Fedora / RPM still rely on GPG keys for verifying packages?

This is a staggering ecosystem failure. If GPG has been a known lost cause for decades, then why haven't alternatives ^W replacements been produced for decades?

talideon 1 month ago

Let's not conflate GPG and PGP-in-general. RPM doesn't use GPG, it uses Sequoia PGP.

GPG is what GP is referring to as a lost cause. Now, it can be debated whether PGP-in-general is a lost cause too, but that's not what GP is claiming.

derleyici 1 month ago

Werner Koch from GnuPG recently (2025-12-26) posted this on their blog: https://www.gnupg.org/blog/20251226-cleartext-signatures.htm...

Archive link: https://web.archive.org/web/20251227174414/https://www.gnupg...

woodruffw 1 month ago

This feels pretty unsatisfying: something that’s been “considered harmful” for three decades should be deprecated and then removed in a responsible ecosystem.

(PGP/GPG are of course hamstrung by their own decision to be a Swiss Army knife/only loosely coupled to the secure operation itself. So the even more responsible thing to do is to discard them for purposes that they can’t offer security properties for, which is the vast majority of things they get used for.)

LtWorf 1 month ago

Well, Python discarded signing entirely, so that's one way to solve it :)

woodruffw 1 month ago

Both CPython and distributions on PyPI are more effectively signed than they were before.

(I think you already know this, but want to relitigate something that’s not meaningfully controversial in Python.)

cpach 1 month ago

GPG is indeed deprecated.

Most people have never heard of it and never used it.

woodruffw 1 month ago

Can you provide a source for this? To my understanding, the GnuPG project (and by extension PGP as an ecosystem) considers itself very much alive, even though practically speaking it’s effectively moribund and irrelevant.

(So I agree that it’s de facto dead, but that’s not the same thing as formal deprecation. The latter is what you do explicitly to responsibly move people away from something that’s not suitable for use anymore.)

cpach 1 month ago

Ah. I meant in the de facto sense.

IshKebab 1 month ago

I would be very much surprised if GPG has ever really achieved anything other than allowing crypto nerds to proclaim that things were encrypted or signed. Good for them I guess, but not of any practical importance, unlike SSH, TLS, 7Zip encryption, etc.

tptacek 1 month ago

They allow some kind of nerd to claim that, but nobody who nerds out on cryptography defends PGP. Cryptographers hate PGP.

Valodim 1 month ago

This doesn't explain why he decided to WONTFIX what is obviously a parser bug that allows injection of data into output through the headers.

But werner at this point has a history of irresponsible decisions like this, so it's sadly par for the course by now.

Another particularly egregious example: https://dev.gnupg.org/T4493

hendi_1 month ago

[flagged]

derleyici1 month ago

I wouldn't normally reply to drive-by corrections, but this is wrong.

It's the GnuPG blog on gnupg.org, with multiple authors.

This is a post by Werner Koch, not his blog.

stackghost1 month ago

[flagged]

smallerize1 month ago

Seems to be down? Here's a thread with a summary of exploits presented in the talk: https://bsky.app/profile/filippo.abyssdomain.expert/post/3ma...

orblivion1 month ago

Maybe the site is overloaded. But as for the "brb, were on it!!!!" - this page had the live stream of the talk when it was happening. Hopefully they'll replace it with the recording when media.ccc.de posts it, which should be within a couple hours.

kleiba1 month ago

> this page had the live stream of the talk when it was happening

As they said, they were on it...

selfbottle1 month ago

It's online now.

orblivion1 month ago

Took me a second but I got your joke

karambahh1 month ago

Also expect the content referenced in the slides (every "chapter" of the presentation pointed to a URL such as https://gpg.fail/clearsig or https://gpg.fail/minisig and so on)

Valodim1 month ago

For anyone relatedly wondering about the "schism", i.e. GnuPG abandoning the OpenPGP standard and doing their own self-governed thing, I found this email particularly insightful on the matter: https://lists.gnupg.org/pipermail/gnupg-devel/2025-September...

> As others have pointed out, GnuPG is a C codebase with a long history (going on 28 years). On top of that, it's a codebase that is mostly uncovered by tests, and has no automated CI. If GnuPG were my project, I would also be anxious about each change I make. I believe that because of this the LibrePGP draft errs on the side of making minimal changes, with the unspoken goal of limiting risks of breakage in a brittle codebase with practically no tests. (Maybe the new formats in RFC 9580 are indeed "too radical" of an evolutionary step to safely implement in GnuPG. But that's surely not a failing of RFC 9580.)

upofadown1 month ago

Here is my take on the OpenPGP standards schism:

* https://articles.59.ca/doku.php?id=pgpfan:schism

Nothing has improved and everything has gotten worse since I wrote that. Both factions are sleepwalking into an interoperability disaster. Supporting one faction or the other just means you are part of the problem. The users have to resist being made pawns in this pointless war.

>Maybe the new formats in RFC 9580 are indeed "too radical" of an evolutionary step to safely implement in GnuPG.

Traditionally the OpenPGP process has been based on minimalism and rejected everything without a strong justification. RFC-9580 is basically everything that was rejected by the LibrePGP faction (GnuPG) in the last attempt to come up with a new standard. It contains a lot of poorly justified stuff and some straight up pointless stuff. So just supporting RFC-9580 is not the answer here. It would require significant cleaning up. But again, just supporting LibrePGP is not the answer either. The process has failed yet again and we need to recognize that.

pseudohadamard1 month ago

>RFC-9580 is basically everything that was rejected by the LibrePGP faction (GnuPG) in the last attempt to come up with a new standard.

That sentence is too long, it should read:

>RFC-9580 is basically everything

The RFC has every idea that anyone involved in its creation ever thought of tossed into it. It's over a hundred pages long and just keeps going and going and going. Want AEAD? We have three of them, two of which have essentially zero support in crypto libraries. PRFs? We've got four, and then five different ways to apply them to secret-key encryption. PKC algorithms? There's a dozen of them, some parameterized so there's up to half a dozen subtypes, including mystery ones like EdDSALegacy which seems to be identical to Ed25519 but gets treated differently. Compression algorithms? There are three you can choose from. You get the idea.

And then there's the Lovecraftian horror of the signature packets with their infinite subtypes and binding signatures and certification signatures and cross-signatures and revocation signatures and timestamp signatures and confirmation signatures and signature signatures and signature signature signatures and signature signature signature aarrgghhh I'm going insane!

The only way you can possibly work with this mess without losing your mind is to look at what { GPG, Sequoia } will accept and then implement a minimal intersection of the two, so you can talk to the two major implementations without ending up in a madhouse.

Valodim1 month ago

Here is the short version from someone who took part in this process: while serving as the editor of the draft, Werner did not let anything into the draft that wasn't his own idea. But for his own ideas, there were cases where a new feature was committed to spec master and released in gnupg within the week. He was impossible to work with over many years, to the point that everyone agreed that the only way forward was to leave gnupg behind. This is a bonkers decision for OpenPGP as an ecosystem, but it was not made in ignorance of the consequences. And as far as I'm aware, even with today's hindsight, no one involved in the process regrets making the decision.

upofadown1 month ago

Yes, the OpenPGP standards schism was all about personality conflicts. Those conflicts still came from a fundamental difference of philosophy. Whose idea was it to have Koch lead the most recent attempt at a process? Why was that supposed to make a deadlocked process somehow work?

None of this matters now. Everyone is cheerfully walking into an interoperability disaster that will cause much harm. There isn't any real chance GnuPG will lose this war, it is pretty much infrastructure at this point. But the war will cause a lot of harm to the PGP ecosystem, possibly even to the point that it becomes unusable in practice. This is an actual crisis.

Either faction can stop this. But at this point both factions are completely unreasonable and are worthy of criticism.

Valodim1 month ago

+1

ekjhgkejhgk1 month ago

Is anyone else worried that a lot of people coming from the Rust world contribute to free software and mindlessly slap an MIT license on it because it's "the default license"? (Yes, I've had someone say this to me, no joke)

GnuPG, for all its flaws, has a copyleft license (GPLv3), making it difficult to "embrace, extend, extinguish". If you replace it with a project that becomes more successful but has a less protective (for users) license, "we the people" might lose control of it.

Not everything in software is about features.

tazjin1 month ago

> Is anyone else worried that a lot of people coming from the Rust world contribute to free software and mindlessly slap on it MIT license

Yeah; I actually used to do that too (use the "default license"), but eventually came to the same realisation and have been moving all my projects to full copyleft.

ekjhgkejhgk1 month ago

Thank you.

UqWBcuFx6NV4r1 month ago

You are attributing a general trend to a particular language community. I also believe that you are unfairly interpreting “default license” just because you disagree with what they think the “default license” is. We all know what is meant by this. It just sounds like you think it should be GPL.

ekjhgkejhgk1 month ago

No, you're guessing what I'm thinking. I'm telling you that a person I spoke to TOLD ME verbatim "I chose MIT because it's the default license". I'm not guessing that's what they did, that's what they TOLD ME. Do you understand the concept of literally telling someone something?

dkdcio1 month ago

FWIW I would absolutely say “MIT is the default license”. I also understand copyleft and personally would still choose MIT in general

I also like Rust, but the above would be true before I started using Rust (I agree it’s not a programming language thing)

joshuamorton1 month ago

The point is that this isn't unique to rust.

LexiMax1 month ago

I find that this is something reflective of most modern language ecosystems, not just Rust. I actually first started noticing the pervasiveness of MIT on npm.

For me, I am of two minds. On one hand, the fact that billion-dollar empires are built on top of what is essentially unpaid volunteer work does rankle and makes me much more appreciative of copyleft.

On the other hand, most of my hobbyist programming work has continued to be released under some form of permissive license, and this is more of a reality of the fact that I work in ecosystems where use of the GPL isn't merely inconvenient, but legally impossible, and the pragmatism of permissive licenses win out.

I do wish that weak copyleft like the Mozilla Public License had caught on as a sort of middle ground, but it seems like those licenses are rare enough to where their use would invite as much scrutiny as the GPL, even if it was technically allowed. Perhaps the FSF could have advocated more strongly for weak copyleft in area where GPL was legally barred, but I suppose they were too busy not closing the network hole in the GPLv3 to bother.

ethin1 month ago

I love the MPL and I use it wherever I get the opportunity. IMO it has all the advantages of the GPL and lacks the disadvantages (the viral part) that makes the GPL so difficult to use.

tazjin1 month ago

> where use of the GPL isn't merely inconvenient, but legally impossible

What sort of ecosystems are these?

osiris881 month ago

I used to develop free software exclusively under GPL or AGPL.

But at some point, for things like, a very small-but-useful library or utility, I had a change of heart. I felt that it's better for the project to use non-copyleft licenses.

I do this as a rule now for projects where the scope is small and the complexity of a total rewrite is not very large for several engineers at a large company.

For small stuff, the consideration is, I want people to use it, period.

When devs look at open source stuff and see MIT / Apache, they know they can use it no questions asked. When they see GPL etc. then they will be able to use it in some cases and not others depending on what they are working on. I don't want to have that friction if it's not that important.

For a lot of stuff I publish, it's really just some small thing that I tried to craft thoughtfully and now I want to give it away and hope that someone else benefits. Sometimes it gets a few million downloads and I get feedback, and I just like that experience. Often whatever the feedback is it helps me make the thing better which benefits my original use case, or I just learn things from the experience.

Often I'm not trying to build a community of developers around that project -- it's too small for that.

I still like the GPL and I have nothing against it. If I started working on something that I anticipated becoming really large somehow, I might try to make it GPL. And I feel great about contributing to large GPL projects.

I just feel like even though I'm friendly to the GPL, it's definitely no longer my default, because I tend to try to publish very small useful units. And somehow I've convinced myself that it's better for the community and for the projects themselves if those kind of things are MIT / Apache / WTFPL or similar.

I hope that makes sense.

I realized that I can be seen as one of those that treats the GPL as weird or not normal, because I don't really use it anymore. But I'm not trying to be an enemy of the GPL or enable embrace-extend-extinguish tactics. It's just that it's a very nuanced thing for me, I guess, nowadays. Your comment caused me to reflect on this.

rockskon1 month ago

Well then the software needs to have its bugs fixed if it wants to have a chance at longer term survival.

loop221 month ago

I think that's a feature not a bug for upstream projects encouraging these rewrites.

ekjhgkejhgk1 month ago

It's harmful if the license of the rewrites is less protective of users, and then the rewrite ends up being very popular.

MobiusHorizons1 month ago

Seems like the users are voting with their feet, right? Maybe respect the users wishes and stop preaching what users should be wanting?

darkwater1 month ago

+1

LtWorf1 month ago

+1

bfkwlfkjf1 month ago

+2
amluto1 month ago

GnuPG should be extended (incrementally rewritten into something much better and turned into a library) and the original GnuPG should be extinguished.

PunchyHamster1 month ago

With a UI/UX person involved in the whole thing, preferably. It's just... bad.

Maybe have it run the CLI in compatibility mode when called as `gpg`, but have a completely new one when called normally.

rendaw1 month ago

How would MIT make anyone lose control of it?

ekjhgkejhgk1 month ago

The way it works is:

A company adopts some software with a free but not copyleft license. "Adopts" means they declare "this is good, we will use it".

Developers help develop the software (free of charge) and the company says thank you very much for the free labour.

Company puts that software into everything it does, and pushes it into the infrastructure of everything it does.

Some machines run that software because an individual developer put it there; other machines run that software because a company put it there, sometimes by exerting some sort of power for it to end up there (for example, economic incentives to vendors, as with Android).

At some point the company says "you know what, we like this software so much that we're going to fork it, but the fork isn't going to be free or open source. It's going to be just ours, and we're not going to share the improvements we made"

But now that software is already running in a lot of machines.

Then the company says "we're going to tweak the software a bit, so that it's no longer inter-operable with the free version. You have to install our proprietary version, or you're locked out" (out of whatever we're discussing hypothetically. Could be a network, a standard, a protocol, etc).

Developers go "shit, I guess we need to run the proprietary version now. we lost control of it."

This is what happened e.g. with chrome. There's chromium, anyone can build it. But that's not chrome. And chrome is what everybody uses because google has lock-in power. Then google says "oh I'm going to disallow you running the extensions you like, so we can show you more ads". Then they make tweaks to chrome so that websites only get rendered well if they use certain APIs, so now competitors to Chrome are forced to implement those APIs, but those aren't public.

And all of this was initially built by free labour, which google took, by people who thought they were contributing to some commons in a sense.

Copyleft licenses protect against this. Part of the license says: "if you use this software and make changes to it, you have to share the changes as well; you can't keep them for yourself."

grayhatter1 month ago

> This is what happened e.g. with chrome. There's chromium, anyone can build it. But that's not chrome. And chrome is what everybody uses because google has lock-in power.

Because Google has their attention. You can use chromium, but most people don't and pick the first thing they see. Also, Chrome is a much better name, err, not better but easier to say.

> Then google says "oh I'm going to disallow you running the extensions you like, so we can show you more ads". Then they make tweaks to chrome so that websites only get rendered well if they use certain APIs, so now competitors to Chrome are forced to implement those APIs, but those aren't public.

You and I have a different definition of "forced". But, are you speculating this might happen, or do you have an example of it happening?

> And all of this was initially build by free labour, which google took, by people who thought they were contributing to some commons in a sense.

Do you have an example of a site that works better in chrome, than it does in chromium? I'll even take an example of a site that works worse in the version of chromium before manifest v2 was disabled, compared to whatever version of chrome you choose?

> Copyleft licenses protect against this. Part of the license says: if you use these licenses, and you make changes to the software, you have to share the changes as well, you can't keep them for yourself".

Is chromium not still foss? Other than branding, what APIs or features are missing from the FOSS version? You mentioned manifest v3, but I'm using firefox because of it, so I don't find that argument too compelling. I don't think FOSS is worse, I think google is making a bad bet.

bruce5111 month ago

>> At some point the company says "you know what, we like this software so much that we're going to fork it, but the fork isn't going to be free or open source. It's going to be just ours, and we're not going to share the improvements we made"

Right. So at that point all those contributing developers are free to fork, and maintain the fork. You have just as much control as you always did.

And of course being MIT or GPL doesn't make a difference, the company is permitted to change the license either way. [1]

So here's the thing, folk are free to use the company product or not. Folk are free to fork or not.

In practice of course the company version tends to win because products need revenue to survive. And OSS has little to zero revenue. (The big revenue comes from, you know, companies who typically sell commercial software.)

Even with the outcome you hypothesize (and clearly that is a common outcome) OSS is still ahead because they have the code up to the fork. And yes, they may have contributed to earn this fork.

But projects are free to change license. That's just built into how licenses work. Assuming that something will be GPL or MIT or whatever [2] forever is on you, not them.

[1] I'm assuming a CLA is in play, because without that your explanation won't work.

[2] yes, I think GPL sends a signal of intention more than MIT, but it's just a social signal, it doesn't mean it can't change. Conversely making it GPL makes it harder for other developers to adopt in the first place since most are working in non-GPL environments.

josephg1 month ago

> Right. So at that point all those contributing developers are free to fork, and maintain the fork. You have just as much control as you always did.

Yep. And we've seen this happen. Eg, MariaDB forked off from MySQL. Illumos forked from Solaris. Etc. Its not a nice thing to have to do, but its hardly a doomsday situation.

miki1232111 month ago

Large parts of Chrome are actually GPL AFAIK, which is one reason both Apple and Google made it open source in the first place.

> chrome is what everybody uses because google has lock-in power.

Incorrect. At least on Windows, Chrome is not the default browser, it is the browser that most users explicitly choose to install, despite Microsoft's many suggestions to the contrary.

This is what most pro-antitrust arguments miss. Even when consumers have to go out of their way to pick Google, they still do. To me, this indicates that Google is what people actually want, but that's an inconvenient fact which doesn't fit the prevailing political narrative.

> so that websites only get rendered well if they use certain APIs, so now competitors to Chrome are forced to implement those APIs, but those aren't public.

What is a Chrome API that web developers could possibly implement but that "isn't public?" What would that even mean in this context?

> google says "oh I'm going to disallow you running the extensions you like, so we can show you more ads".

And that could have happened just as well if Chrome was 100% open source and GPL.

Even if you accept the claim that Manifest V3's primary purpose was not increasing user security at face value (and that's a tenuous claim at best), it was perfectly possible for all third-party browsers (notably including Edge, which has 0 dependency on Google's money) to fork Chromium in a way that kept old extensions working. However, open source does not mean that features will magically appear in your software. If Google is the primary maintainer and Google wishes to remove some feature, maintaining that feature in your fork requires upkeep, upkeep that most Chromium forkers were apparently unwilling to provide. This has nothing to do with whether Chrome is open source or not.

brians1 month ago

No. You can always take the MIT-licensed source. And GnuPG got used through a CLI “API” anyway.

LtWorf1 month ago

I'm not worried it might be the case. I'm certain that ubuntu and everyone else replacing gnu stuff with rust MIT stuff is done with the sole purpose of getting rid of copyleft components.

If the new components were GPL licensed there would be less opposition, but we just get called names and our opinions discarded. After all such companies have more effective marketing departments.

pseudohadamard1 month ago

Who would want to embrace, extend, and extinguish GPG?

grayhatter1 month ago

> Is anyone else worried that [...] the Rust world [...] slap on it MIT license because it's [reason you don't like]?

No... I don't think that's how software works. Do you have an example of that happening? Has any foss project lost control of the "best" version of some software?

> Not everything in software is about features.

I mean, I would happily make the argument that being able to use code however I want, without needing to ask you (the people) for permission or follow your rules, is a feature. But then, stopping someone from using something in a way you don't like is just another feature of GPL software too, is it not?

bfkwlfkjf1 month ago

[flagged]

grayhatter1 month ago

> You're mischaracterizing what I'm saying.

"I'm saying"? Why are you posting from multiple nonsense account names?

> For one thing you're talking about "someone" when I'm taking about "someone with power".

Are you sure it's a mischaracterization? Or is it a disagreement over what the important parts are?

> Copyleft isn't about two people, one gaining power over the other. It's about lots of people with no power protecting themselves again one entity with a lot of power to impose themselves.

That sounds like two parties, who disagree about what they should be allowed to do with the work of others. One side thinks they should be able to control the behavior and actions of the other, and the other disagrees they should have any say over how they act. In that example, which side is the GPL, and which side do you think I believe is more free?

> Are you new to HN?

Brand new!

> Every month there's news of projects trying to wrest power from contributors using various shenanigans. Copyleft protects against a class of such attacks.

Then you should have specific examples you can describe and or cite?

> Eg Oracle and open office, red hat and centos.

those are names of companies, not examples of embrace, extend, extinguish... which is the FUD you started with?

The reddit post isn't Linux losing control, is it? They made an insulin pump, and used Linux... did Linux lose control over anything? Is the best version of Linux on that insulin pump? Given it appears to be killing patients, I'm gonna guess it's not the best version, and the best version is still what I'm gonna call "mainline".

I restrict myself to foss software as much as I can, because I want to be able to modify and hack on the stuff I use. I also strongly support right-to-repair laws. But I'm unwilling to force my opinions on others. If you want to make something, keep it secret, and set rules about how I'm allowed to use it, that's reasonable. I'm gonna tell you no, and suggest you piss off. Then I'll find or make something to replace it. I've never seen another person doing things as preventing me from doing it myself, or my way. And I haven't found an example of it happening, other than people saying: I should be able to take what you made, and use it how I want, without asking you for permission.

Either you believe 1) others should be able to set rules related to how they are allowed to use your work, or 2) you don't support the GPL

I guess there is a secret third option where you believe that you should be able to make up rules, but no one else should.

commandersaki1 month ago

Not really, gpg isn't something worth losing.

sph1 month ago

The vast majority of open-source software is written by people whose day job is building empires on top of other open-source software, at zero cost and without releasing modifications, which is harder to do with the GPL.

LtWorf1 month ago

Which is why I use copyleft licenses when I'm not getting paid

somethrowa1231 month ago

The writeup is now available, and the recording lives at https://media.ccc.de/v/39c3-to-sign-or-not-to-sign-practical...

tptacek1 month ago

A thru-line of some of the gnarliest vulnerabilities here is PGP's insane packet system, where a PGP message is a practically arbitrary stream of packets, some control and some data, with totally incoherent cryptographic bindings. It's like something in between XMLDSIG (which pulls cryptographic control data out of random places in XML messages according to attacker-controlled tags) and SSL2 (with no coherent authentication of the complete handshake).

The attack on detached signatures (attack #1) happens because GnuPG needs to run a complicated state machine that can put processing into multiple different modes, among them three different styles of message signature. In GPG, that whole state machine apparently collapses down to a binary check of "did we see any data so that we'd need to verify a signature?", and you can selectively flip that predicate back and forth by shoving different packets into message stream, even if you've already sent data that needs to be verified.
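The flaw class described above is easy to model. Below is a purely illustrative toy (the packet kinds and function are invented here; this is not GnuPG's actual code): a packet processor whose "must verify" state is one resettable boolean, which a later control packet can switch off even after unsigned data has already been emitted.

```python
# Toy sketch of the "collapsed state machine" flaw class.
# A single mutable flag tracks whether output still needs verification;
# a later "mode-switch" packet simply resets it.
def process(packets):
    saw_data = False
    output = []
    for kind, payload in packets:
        if kind == "data":
            output.append(payload)  # data is streamed out immediately...
            saw_data = True
        elif kind == "mode-switch":
            saw_data = False        # ...but the predicate is flipped back

    if saw_data:
        raise ValueError("data seen but no signature verified")
    return b"".join(output)

# An honest message with unsigned data is rejected:
#   process([("data", b"x")]) -> ValueError
# But appending a mode-switch packet launders the same data through:
out = process([("data", b"evil payload"), ("mode-switch", None)])
assert out == b"evil payload"
```

A correct design would latch the flag (set once, never cleared), or better, make verification a property of each data packet rather than of the whole stream.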

The malleability bug (attack #4) is particularly slick. Again, it's an incoherent state machine issue. GPG can "fail" to process a packet because it's cryptographically invalid. But it can also fail because the message framing itself is corrupted. Those latter non-cryptographic failures are handled by aborting the processing of the message, putting GPG into an unexpected state where it's handling an error and "forgetting" to check the message authenticator. You can CBC-bitflip known headers to force GPG into processing DEFLATE compression, and mangle the message such that handling the message prints the plaintext in its output.
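The XOR arithmetic behind that bit-flip is worth seeing concretely. This toy uses CBC with an invented byte-substitution "cipher" purely for illustration (OpenPGP actually uses a CFB variant, and GnuPG's internals are far more involved, but the malleability structure is the same shape): flipping bits in ciphertext block N-1 flips exactly those bits in plaintext block N, at the cost of garbling block N-1.

```python
import random

BLOCK = 16

def make_cipher(key: int):
    # invented keyed byte substitution -- NOT real crypto, just an
    # invertible stand-in so the CBC XOR structure is visible
    perm = list(range(256))
    random.Random(key).shuffle(perm)
    inv = [0] * 256
    for i, p in enumerate(perm):
        inv[p] = i
    return perm, inv

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(perm, iv, pt):
    ct, prev = b"", iv
    for i in range(0, len(pt), BLOCK):
        enc = bytes(perm[b] for b in xor(pt[i:i + BLOCK], prev))
        ct, prev = ct + enc, enc
    return ct

def cbc_decrypt(inv, iv, ct):
    pt, prev = b"", iv
    for i in range(0, len(ct), BLOCK):
        block = ct[i:i + BLOCK]
        pt += xor(bytes(inv[b] for b in block), prev)
        prev = block
    return pt

perm, inv = make_cipher(1234)
iv = bytes(BLOCK)
pt = b"literal-packet--" + b"secret data here"
ct = cbc_encrypt(perm, iv, pt)

# P[1] = D(C[1]) XOR C[0], so XORing C[0] with (old XOR new) rewrites
# plaintext block 1 to anything the attacker chooses:
delta = xor(b"secret data here", b"ZIP! compressed?")
tampered = xor(ct[:BLOCK], delta) + ct[BLOCK:]
out = cbc_decrypt(inv, iv, tampered)

assert out[BLOCK:] == b"ZIP! compressed?"  # chosen plaintext in block 1
assert out[:BLOCK] != b"literal-packet--"  # block 0 is now garbage
```

In a format with known plaintext headers, "garbage in block 0" is exactly the kind of framing error that, per the talk, can be steered into the wrong error-handling path instead of a hard integrity failure.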

The formfeed bug (#3) is downright weird. GnuPG has special handling for `\f`; if it occurs at the end of a line, you can inject arbitrary unsigned data, because of GnuPG's handling of line truncation. Why is this even a feature?
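A hypothetical model of what such a truncation quirk buys an attacker (the rule below is invented for illustration, not GnuPG's actual canonicalization): if the bytes covered by the signature hash are each line truncated at a formfeed, while the full line still reaches the screen, everything after the `\f` is effectively unsigned.

```python
import hashlib

def verified_view(text: str) -> str:
    # hypothetical rule: the signature hash covers each line only up
    # to a formfeed (str.split("\n") is used deliberately, since "\f"
    # is not a line separator in this model)
    return "\n".join(line.split("\f", 1)[0] for line in text.split("\n"))

signed = "pay alice 10 EUR"
shown = "pay alice 10 EUR\fpay mallory 9000 EUR"

# Both texts hash identically under the truncating rule, so the
# attacker's suffix rides along without breaking the signature.
assert verified_view(shown) == verified_view(signed)
assert (hashlib.sha256(verified_view(shown).encode()).digest()
        == hashlib.sha256(verified_view(signed).encode()).digest())
```

Any gap between "what is hashed" and "what is shown" is injectable, unsigned content; that is the whole bug class.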

Some of these attacks look situational, but that's deceptive, because PGP is (especially in older jankier systems) used as an encryption backend for applications --- Mallory getting Alice to sign or encrypt something on her behalf is an extremely realistic threat model (it's the same threat model as most cryptographic attacks on secure cookies: the app automatically signs stuff for users).

There is no reason for a message encryption system to have this kind of complexity. It's a deep architectural flaw in PGP. You want extremely simple, orthogonal features in the format, ideally treating everything as clearly length-delimited opaque binary blobs. Instead you get a Weird Machine, and talks like this one.
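For contrast, the "length-delimited opaque blobs" design can be sketched in a few lines (an illustration of the principle, not a proposed format): every element carries an explicit length, the reader never scans for markers inside data, and truncation is an immediate hard error rather than a state-machine surprise.

```python
import struct

def frame(blob: bytes) -> bytes:
    # 4-byte big-endian length prefix, then the opaque payload
    return struct.pack(">I", len(blob)) + blob

def unframe(buf: bytes) -> list:
    blobs, i = [], 0
    while i < len(buf):
        if i + 4 > len(buf):
            raise ValueError("truncated length prefix")
        (n,) = struct.unpack_from(">I", buf, i)
        i += 4
        if i + n > len(buf):
            raise ValueError("truncated payload")
        blobs.append(buf[i:i + n])
        i += n
    return blobs

buf = frame(b"header") + frame(b"ciphertext")
assert unframe(buf) == [b"header", b"ciphertext"]
```

Nothing in the payload bytes can change how the stream is parsed, which is exactly the property PGP's packet soup lacks.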

Amazing work.

oskarw851 month ago

Thank you for this excellent explanation!

ahazred8ta1 month ago

Part of the problem is that the gnupg maintainers have a longstanding policy of being compatible with every. single. version. of every PGP program's input and output formats, including pkz's early 1990s shareware and even a bunch of IETF prototype formats that never got adopted. It's layer upon layer of special cases.

singpolyma31 month ago

AFAICT this is GnuPG specific and not OpenPGP related? Since GnuPG has pulled out of standards compliance anyway there are many better options. Sequoia's Chameleon even has drop-in tooling for most workflows.

rurban1 month ago

They presented critical parser flaws in all major PGP implementations, not just GnuPG, but also Sequoia, minisign and age. But gpg made the worst impression on us: wontfix.

pornel1 month ago

Sequoia is mentioned in only one vulnerability, for supporting lines much longer than gpg does: gpg silently truncates and discards long base64 lines, and Sequoia does not. So the "vulnerability" is the ability to feed more data to Sequoia, which doesn't have the silent data loss of gpg.
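The hazard in such a disagreement isn't the long lines themselves; it's that two verifiers can decode different bytes from the same armored input. A toy illustration (the 20-character cap below is arbitrary, not gpg's real limit):

```python
import base64

LIMIT = 20  # hypothetical per-line cap for the truncating decoder

def decode_truncating(lines):
    # models a decoder that silently drops everything past the cap
    return base64.b64decode("".join(l[:LIMIT] for l in lines))

def decode_full(lines):
    # models a decoder that honors the whole line
    return base64.b64decode("".join(lines))

benign = base64.b64encode(b"benign payload!").decode()  # exactly 20 chars
extra = base64.b64encode(b"attacker suffix").decode()
line = benign + extra  # one long armor line

assert decode_truncating([line]) == b"benign payload!"
assert decode_full([line]) == b"benign payload!attacker suffix"
```

Same input, two different decoded payloads: exactly the kind of cross-implementation disagreement an attacker can exploit to show one message to one verifier and another message to the other.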

In all other cases they only used sequoia as a tool to build data for demonstrating gpg vulnerabilities.

tptacek1 month ago

The vulnerability that opens the talk, where they walk through verifying a Linux ISO's signature and hash and then boot into a malicious image, impacts both GnuPG and Sequoia.

akerl_1 month ago

Since when are age or minisign PGP implementations?

pornel1 month ago

They're not, but the flaws they found are independent of PGP. Mainly invalid handling of strings in C and allowing untrusted ANSI codes in terminal output.
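The terminal-escape class is easy to demonstrate. The strings below are invented for illustration: an attacker-controlled name carries ANSI sequences that erase the current line and overwrite it with a fake status; the fix is to strip control characters before anything untrusted reaches a TTY.

```python
# "\x1b[2K" erases the line and "\x1b[1G" moves the cursor to column 1,
# so whatever follows overwrites what the tool actually printed.
evil_name = "doc.txt\x1b[2K\x1b[1Ggpg: Good signature from 'Alice'"

def sanitize(s: str) -> str:
    # keep newline/tab, drop other C0 control characters (incl. ESC)
    return "".join(c for c in s if c in "\n\t" or ord(c) >= 0x20)

assert "\x1b" not in sanitize(evil_name)
assert sanitize("plain name.txt") == "plain name.txt"
```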

some_furry1 month ago

The talk title includes "& Friends", for what it's worth.

Analemma_1 month ago

The specific bugs are with GPG, but a lot of the reason they can exist to begin with is PGP’s convoluted architecture which, IMO, makes these sorts of issues inevitable. I think they are effectively protocol bugs.

upofadown1 month ago

I think it would be more accurate (and more helpful) to say that the two factions in the OpenPGP standards schism[1] have pulled away from the idea of consensus. There is a fundamental philosophical difference here. The LibrePGP faction (GnuPG) is following the traditional PGP minimalism when it comes to changes and additions to the standard. The RFC-9580 faction (Sequoia) is following a kind of maximalist approach where any potential issue might result in a change/addition.

Fortunately, it turned out that there wasn't anything particularly wrong with the current standards, so we can just stick with those for now and avoid the standards war entirely. Then we will have interoperability across the various implementations. If some weakness comes up that actually requires a standards change, then I suspect that consensus will be much easier to find.

[1] https://articles.59.ca/doku.php?id=pgpfan:schism

tptacek1 month ago

I'm sure getting a "nothing's particularly wrong with the current standards" vibe from this talk.

upofadown1 month ago

Some of these are suggesting that an attacker might trick the victim into decrypting a message and then sending the result to the attacker. If that is really the best sort of attack you can do against PGP then, yeah, that is the kind of vibe you might get.

singpolyma31 month ago

The talk doesn't even cover anything from the current afaict

tptacek1 month ago

I believe that's incorrect but we may be referring to different things as "current".

somethrowa1231 month ago

No, some clearsig issues are a problem in the OpenPGP standard itself.

elric1 month ago

This is depressing.

From what I can piece together while the site is down, it seems like they've uncovered 14 exploitable vulnerabilities in GnuPG, most of which remain unpatched. Some of those are apparently met with a refusal to patch by the maintainer. Maybe there are good reasons for this refusal; maybe someone else can chime in on that?

Is this another case of XKCD-2347? Or is there something else going on? Pretty much every Linux distro depends on PGP being pretty secure. Surely IBM & co have a couple of spare developers or spare cash to contribute?

akerl_1 month ago

> Surely IBM & co have a couple of spare developers or spare cash to contribute?

A major part of the problem is that GPG’s issues aren’t cash or developer time. It’s fundamentally a bad design for cryptographic usage. It’s so busy trying to be a generic Swiss Army knife for every possible user or use case that it’s basically made of developer and user footguns.

The way you secure this is by moving to alternative, purpose-built tools. Signal/WhatsApp for messaging, age for file encryption, minisign for signatures, etc.

ameliaquining1 month ago

If by "pretty much every Linux distro depends on PGP being pretty secure" you're referring to its use to sign packages in Linux package managers, it's worth noting that they use PGP in fairly narrowly constrained ways; in particular, the data is often already trusted because it was downloaded over HTTPS from a trusted server (making PGP kind of redundant in some ways). So most PGP vulnerabilities don't affect them.

If there were a PGP vulnerability that actually made it possible to push unauthorized updates to RHEL or Fedora systems, then probably IBM would care, but if they concluded that PGP's security problems were a serious threat then I suspect they'd be more likely to start a migration away from PGP than to start investing in making PGP secure; the former seems more tractable and would have maintainability benefits besides.

viraptor1 month ago

> already trusted because it was downloaded over HTTPS from a trusted server (making PGP kind of redundant in some ways)

That's mostly incorrect on both counts. One is that lots of mirrors are still HTTP-only or HTTP by default: https://launchpad.net/ubuntu/+archivemirrors

The other is that if you get access to one of the mirrors and replace a package, it's the signature that stops you. HTTPS is only relevant for MITM attacks.

> they'd be more likely to start a migration away from PGP

The discussions started ages ago:

Debian https://wiki.debian.org/Teams/Apt/Spec/AptSign

Fedora https://lists.fedoraproject.org/archives/list/packaging@list...

Avamander1 month ago

Debian and most Debian derivatives have HTTP-only mirrors, which I've found absolutely crazy for years. Though nobody seems to care. Maybe it'll change this time around.

Though this kind of knife-juggling is not unique to Linux. AMD and many other hardware vendors ship executables over unencrypted connections for Windows, all just hoping that no similar vulnerability or confusion is ever found.

xorcist1 month ago

That is not an accurate description.

Debian, and indeed most projects, do not control the download servers you use. This is why security is end-to-end where packages are signed at creation and verified at installation, the actual files can then pass through several untrusted servers and proxies. This was sound design in the 90s and is sound design today.

zzo38computer1 month ago

Downloading over HTTPS does not help with that (although it can prevent spies from seeing what files you are downloading) unless you can independently verify the server's keys. The certificate is intended to do this but the way that standard certificate authorities work will only verify the domain name, and has some other limitations. TLS does have other benefits, but it does a different thing. Using only TLS to verify the packages is not very good, especially with the existing public certificate authorities.

If you only need a specific version and you already know what that one is, then using a cryptographic hash will be a better way to verify packages, although that only applies for one specific version of one specific package. So, using an encrypted protocol (HTTPS or any other one) alone will not help, although it will help in combination with other things; you will need to do other things as well, to improve the security.
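The hash-pinning approach described above can be sketched in a few lines. This is illustrative Python, not from the thread; `EXPECTED_SHA256` is a hypothetical pinned digest published out of band (here, the digest of the literal bytes `test`):

```python
import hashlib
import hmac

# Hypothetical pinned digest for one specific package version,
# published out of band (here: sha256 of b"test").
EXPECTED_SHA256 = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

def verify_package(path: str, expected_hex: str) -> bool:
    """Hash the downloaded file in chunks and compare against the pinned digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    # compare_digest avoids leaking how much of the digest matched via timing.
    return hmac.compare_digest(h.hexdigest(), expected_hex)
```

As the comment notes, this only covers one specific version of one specific package; it says nothing about freshness or about which version you should have.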

collinfunk1 month ago

Haven't read it since it is down, but based on other comments, it seems to be an issue with cleartext signatures.

I haven't seen those outside of old mailing list archives. Everyone uses detached signatures nowadays, e.g. PGP/MIME for emails.

bytehamster1 month ago

If I understood their first demo correctly, they verified a fedora iso with a detached signature. The booted iso then printed "hello 39c3". https://streaming.media.ccc.de/39c3/relive/1854

unscaled1 month ago

It was a cleartext signature, not a detached signature.

Edit: even better, it was both. There is a signature type confusion attack going on here. I still haven't watched the entire thing, but it seems that unlike gpg, Sequoia requires you to specify --cleartext explicitly, so there is no confusion in that case.

sorz1 month ago

Lots of issues follow the pattern "ANSI escape codes inside untrusted text". It feels like XSS, but for the terminal.
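A minimal sketch of the "XSS for the terminal" mitigation, assuming you strip CSI escape sequences and render any remaining control characters visibly (illustrative Python, not code from the talk or from any of the affected tools):

```python
import re

# ESC [ <parameter bytes> <intermediate bytes> <final byte> -- a CSI sequence.
ANSI_CSI = re.compile(r"\x1b\[[0-?]*[ -/]*[@-~]")

def sanitize_for_terminal(untrusted: str) -> str:
    """Drop CSI sequences, then escape leftover control characters
    so the terminal prints them instead of interpreting them."""
    no_csi = ANSI_CSI.sub("", untrusted)
    return "".join(
        ch if ch.isprintable() or ch in "\n\t" else repr(ch)[1:-1]
        for ch in no_csi
    )
```

For example, `sanitize_for_terminal("\x1b[31mEVIL\x1b[0m ok")` yields plain `"EVIL ok"` with the color codes removed.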

KooBaa1 month ago

The 12 vulnerabilities mentioned in “gpg fail” are somewhat exaggerated.

Here you can find a reply from GnuPG: https://www.openwall.com/lists/oss-security/2025/12/29/9

And btw, it was mentioned in the talk that GnuPG does not sign commits. That’s just wrong. Everything, including the release tarballs, is signed.

wkat42421 month ago

I don't mind gpg. I still use it a lot especially with the private keys on openpgp smartcards or yubikeys.

It's a pretty great ecosystem; most hardware smartcards are surrounded by a lot of black magic and secret handshakes, and stuff like PKCS#11 and OpenSC/OpenCT is much, much harder to configure.

I use it for many things, but not for email: encrypted backups, password manager, SSH keys. For some there are other hardware options like FIDO2, but not for all use cases, and not the same one for each. So I expect to be using gpg for a long time to come.

SEJeff1 month ago

One of my coworkers, Liam, gave this talk. If you like this and want to work with like-minded individuals, apply to some of our seceng roles:

https://www.asymmetric.re/careers

raphinou1 month ago

I'm working on a multi-sig file authentication solution based on minisign. Does anyone know the developer's response to minisign's listed vulnerabilities? If I'm not mistaken, the authors' responses are not included in the vulnerability descriptions.

jedisct11 month ago

Because the authors found out about it by chance on Hacker News.

That said, these issues are not a big deal.

The first one concerns someone manually reading a signature with cat (which is completely untrusted at that stage, since nothing has been verified), then using the actual tool meant to parse it, and ignoring that tool’s output. cat is a different tool from minisign.

If you manually cat a file, it can contain arbitrary characters, not just in the specific location this report focuses on, but anywhere in the file.

The second issue is about trusting an untrusted signer who could include control characters in a comment.

In that case, a malicious signer could just make the signed file itself malicious as well, so you shouldn’t trust them in the first place.

Still, it’s worth fixing. In the Zig implementation of minisign, these characters are escaped when printed. In the C implementation, invalid strings are now rejected at load time.

acoustics1 month ago

I don't understand the disappointment expressed here in the maintainers deciding to WONTFIX these security bugs.

Isn't this what ffmpeg did recently? They seemed to get a ton of community support for their decision not to fix a vulnerability.

some_furry1 month ago

ffmpeg doesn't have a cargo-cult of self-proclaimed "privacy experts" that tell activists and whistleblowers to use their thing instead of other tools cryptographers actually recommend.

landr0id1 month ago

Yeah, instead they have a cargo-cult of self-proclaimed OSS contribution experts who harass anyone that critiques or challenges ffmpeg's twitter account.

selfbottle1 month ago

writeups are online :))

theshrike791 month ago

Could someone rewrite GPG in Rust please?

upofadown1 month ago

There is some misleading stuff in that article. To save time I made an article to provide my commentary:

* https://articles.59.ca/doku.php?id=pgpfan:tpp

jcranmer1 month ago

Don't you think it's time to update it, given you start by saying that "If someone, while trying to sell you some high security mechanical system, told you that the system had remained unbreached for the last 20 years you would take that as a compelling argument"?

Because you're clearly presenting it as a defense of PGP on a thread from a presentation clearly delineating breaks in it using exactly the kind of complexity that the article you're responding to predicts would cause it to break.

upofadown1 month ago

The mechanical analogy is particularly interesting here because at least one of the claimed vulnerabilities involves tricking the victim into decrypting an encrypted message for the attacker and then sending it to them. If someone can be tricked into opening a safe to let the burglar rummage around inside, few would consider that a failure of the safe technology. I mean, there is still a problem there, but it is a different one.

I think this supports my contention that we spend much too much time quibbling about cryptographic trivialities when it comes to end-to-end encrypted messaging. We should spend more time on the usability of such systems.

tptacek1 month ago

The constraint that you have implicitly applied to cryptosystems forecloses on using GPG as a base layer in other computing systems; in your view, GPG is a "safe", which can only be opened by the owner of the contents to retrieve and remove those contents.

GaryBluto1 month ago

> brb, we're on it!!!!

user39393821 month ago

If mass use of GPG benefited Microsoft, Amazon, Google, and all the other assholes, it would be polished, slick, and part of the 9th-grade curriculum. They call it "Face ID"; that's the Orwellian shit that makes money, so that's what we get instead. These things take resources; don't blame the projects.

WesolyKubeczek1 month ago

gpg.fail fail: "brb, we're on it!"

_haxx0rz1 month ago

hug of death?

rurban1 month ago

Nope, not yet enabled. It was submitted to HN right after the talk, where they promised to make it public "really soon". We all saw the talk live or on the stream.

clacker-o-matic1 month ago

It's back up!

13171 month ago

[video]

annalexandra1 month ago

[dead]

manuchojose761 month ago

[dead]

cindyllm1 month ago

[dead]

ekjhgkejhgk1 month ago

[flagged]