Backdoor in upstream xz/liblzma

This has been ranking very high on HN since yesterday, 29 Mar.

I’m not too deep into IT security, but this seems to be pretty severe, not just for Linux but for open source software in general. As an open source advocate and Linux user I was wondering whether, and how, a simple and concise language such as Zig might help against this problem. I doubt that it can be totally avoided - especially if trust is built over such a long time as in this example - but could it at least become significantly harder? Excited to read your opinions on this.


My understanding is that the backdoor was installed via a committed binary object that was placed under the guise of testing infrastructure. Kinda clever, as OSS says its security comes from many eyes viewing the source, but in this case the source was cleverly obfuscated.

One takeaway I have is that anything that isn’t plain text is suspect. Have scripts that generate blobs instead of committing them.
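A minimal sketch of that idea - assuming a hypothetical `gen_testdata.sh` that gets checked in instead of the blob itself:

```shell
#!/bin/sh
# gen_testdata.sh (hypothetical): regenerate a binary test fixture from
# plain text, so the repo never has to carry an opaque blob.
set -eu
mkdir -p testdata

# The human-readable source of the fixture is what reviewers audit...
printf 'corpus line 1\ncorpus line 2\n' > testdata/corpus.txt

# ...and the binary artifact is derived from it on demand.
# gzip -n omits the name/timestamp header so the output is reproducible.
gzip -n -c testdata/corpus.txt > testdata/corpus.txt.gz
```

Then anyone can delete the blob and re-run the script; if the regenerated file differs from what shipped in a tarball, something is off.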

Having a way to monitor build outputs would be important, but I’m not sure what that would look like. The backdoor was only discovered because a user noticed performance regressions in openssh.
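One low-tech version of such monitoring is the reproducible-builds approach: build the same source twice, ideally on independent machines, and diff the artifact checksums. A sketch with stand-in files - the paths and artifact names here are purely illustrative:

```shell
#!/bin/sh
# Compare checksums of two independent builds of the same source.
# If builds are reproducible, any mismatch flags tampering (or at
# least non-determinism worth investigating).
set -eu

# Stand-in artifacts; in practice these come from two separate builders.
mkdir -p build-a build-b
printf 'artifact' > build-a/liblzma.so.5
printf 'artifact' > build-b/liblzma.so.5

hash_a=$(sha256sum build-a/liblzma.so.5 | cut -d' ' -f1)
hash_b=$(sha256sum build-b/liblzma.so.5 | cut -d' ' -f1)

if [ "$hash_a" = "$hash_b" ]; then
    echo "builds match"
else
    echo "MISMATCH: investigate before shipping" >&2
    exit 1
fi
```

This wouldn’t have caught the xz payload on its own (the rigged build produced the same rigged output everywhere), but it would catch a tarball that doesn’t match what the public repo builds.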

Unfortunately it will be a mark against Open Source and you will see more calls for regulation of OSS, or at least the use of it.


Sure, the actual backdoor code was shipped in a binary test file. But that still requires extraction during the build process - which was rigged by stuff like this (did you spot the “punctuation mistake”?). That reminds me of Python’s eval (which should never be allowed in production code). I’m no C developer though, so I can’t judge how common this is in C projects.
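For readers who haven’t clicked through: the general pattern looks roughly like the sketch below. This is not the actual xz code (which was far better obfuscated); the file name, marker, and “payload” are harmless stand-ins - the payload is just an echo:

```shell
#!/bin/sh
# Illustrative sketch of the pattern only: a "test fixture" that a
# rigged build step quietly turns into executable shell code.
set -eu
mkdir -p tests

# The attacker commits this as an innocent-looking binary fixture.
printf '####MAGIC####\necho pwned-at-build-time\n' > tests/fixture.bin

# The rigged build step: find the marker in the fixture and execute
# everything after it. One line like this is easy to miss inside
# hundreds of lines of autotools-generated noise.
payload=$(sed -n '/####MAGIC####/,$p' tests/fixture.bin | tail -n +2)
eval "$payload"    # prints: pwned-at-build-time
```

The `eval` at the end is exactly the kind of “ugly but necessary”-looking construct the thread is talking about: nothing about it stands out in a typical configure script.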


This sort of BS is very common in C projects, yes.

Zig steers as far away from stuff like this as possible. No pre-processor, no hidden control flow, etc. That of course doesn’t mean it’s impossible. But I’d hope that it’s a lot more difficult to pass off something malicious as “ugly but necessary”.


There are two very good blog posts from Russ Cox:

What Andres Freund found by luck was a multiyear effort to deploy a backdoor giving remote code execution on every computer in the universe.


The problem, IMHO, is that the authors/maintainers started a hobby/personal/company project, but when that project becomes famous, there is a lot of pressure and no incentive.


in case you didn’t read this already:

after re-visiting this blog post, I thought I might as well share it… no matter how impressive the technical side of this attack is, the part that really frightens me is the social engineering side!


There may be process and mechanical steps to help improve guards and protections. BUT reading that timeline by Russ Cox, Lasse Collin admits (2022-06-08) that real-life events and mental health cut into the time he could dedicate to the maintainer role for XZ. We really need to support software that we rely on and attempt to give the maintainer(s) the ability to devote the time that is otherwise taken by their “day job”. So if there is a lesson here, it’s got to be that we chip in when we can and “tip your waiter”. (Also, I just watched the John Oliver piece about food delivery apps.)


The malicious code was actually only in the tarball, not the repo.

There’s an applicable meme here:


I haven’t read all the blog posts or seen all the videos, but my impression is that portraying this incident as an “Open Source” problem isn’t fair. The backdoor was implanted precisely via a closed source mechanism (a crafted xz file and a binary test blob) and kept hidden via another closed source mechanism (convenience tarballs of the pre-built open source project). So if this all were truly open source code end-to-end, it would have been really hard to pull off.


Challenge: hide an exploit by hiding an implementation/interpreter + program for a variant of the Whitespace programming language.

Then have the exploit be defeated by auto formatters :wink:


That’s not quite what most people agree open source is, but I get your point.

Malicious code was checked into the repo, embedded in a binary test file. What was not checked into the repo and existed only in the tarball were the generated m4 file and the modified configure script that extracted and applied the backdoor during the build.