submitted 4 days ago* (last edited 4 days ago) by K3can@lemmy.radio to c/selfhosted@lemmy.world

I set up a quick demonstration to show the risks of curl|bash and how a bad actor could potentially hide a malicious script behind one that appears safe.

It's nothing new or groundbreaking, but I figure it never hurts to have another reminder.

[-] osanna@thebrainbin.org 63 points 4 days ago

you’d have to be mad to willingly pipe a script to bash without checking it. holy shit

[-] wildbus8979@sh.itjust.works 30 points 4 days ago

And you'd better inspect and execute a downloaded copy, because a malicious actor can serve a different file to curl/wget than to your browser.

[-] Flipper@feddit.org 19 points 4 days ago

They can even serve a different file for curl vs curl|bash

[-] wildbus8979@sh.itjust.works 8 points 4 days ago

Yeah, they do. I remember the demo being pretty impressive ten or fifteen years ago!

[-] deadbeef79000@lemmy.nz 7 points 4 days ago

Does curl send a different useragent when it's piped?

Searching for those words just vomits 'hOW to SeT cUrL's UseRaGenT' blog spam.

[-] Flipper@feddit.org 19 points 4 days ago* (last edited 4 days ago)

It's timing based. When piped a script, bash executes each line completely before taking the next line from the input. Curl has a limited output buffer.

  1. Start with an operation that takes a long time: a sleep or, if you want it less obvious, a download, an unzip operation, apt update, etc.
  2. Fill the buffer with more bash commands.
  3. On the server, measure whether curl stops downloading the script at some point.
  4. If it does, serve a malicious payload.
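That execution model is easy to see locally; a minimal sketch, with printf standing in for curl (no network involved):

```shell
# bash runs each line of a piped script to completion before taking the next:
# "started" prints immediately, then nothing for two seconds, then "finished".
printf 'echo started\nsleep 2\necho finished\n' | bash | tee streamed.log
```

While that sleep runs, bash isn't reading from the pipe; on a real download, that back-pressure eventually stalls curl's connection, which is the signal the server measures in step 3.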
[-] deadbeef79000@lemmy.nz 3 points 4 days ago

Oh that is clever.

[-] qupada@fedia.io 5 points 4 days ago

Not that I know of, which means I can only assume it'll be a timing-based attack.

With strategic use of sleep statements in the script you should stand a pretty good chance of detecting the HTTP download blocking while the script execution is paused.

If you were already shipping the kind of script that unpacks a binary payload from the tail end of the file and executes it, it's well within the realm of possibility to swap it for a different one.

[-] K3can@lemmy.radio 10 points 4 days ago* (last edited 4 days ago)

Yep! That's what the post shows.

I created a live demo file, too, so that you can actually see the difference based on how you request the file.

[-] Dave@lemmy.nz 21 points 4 days ago

Is it different from running a bash script you downloaded without checking it? E.g. the installer that you get with GOG games?

Genuine question, I'm no expert.

[-] osanna@thebrainbin.org 12 points 4 days ago

I have no problem with running scripts from the internet, AFTER you check them. Do NOT blindly run a script you found on the internet. As others have said: download them, check them, and then and only then run them if they're safe. NEVER pipe to bash, ever.
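A minimal sketch of that download-then-check flow, runnable offline (the here-doc stands in for `curl -fsSL <url> -o install.sh`, and the grep is only a crude first-pass scan, not a real review):

```shell
# step 1: fetch to a file instead of piping straight to bash
cat > install.sh <<'EOF'
echo "installer ran"
EOF
# step 2: actually read it (interactively you'd use less or an editor; here
# a grep for a few obvious red-flag commands stands in for that review)
grep -nE 'curl|wget|rm -rf' install.sh || echo "no obvious red flags"
# step 3: run it only once you're satisfied
bash install.sh
```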

[-] Dave@lemmy.nz 12 points 4 days ago

Ok but not everyone has that skill. And anyway, how is this different to running a binary where you can't check the code?

[-] osanna@thebrainbin.org 12 points 4 days ago

It's exactly the same. Don't run binaries you don't trust fully. But I get what you mean. miley_cyrus_nude.jpg.exe is probably gonna end badly.

[-] Dave@lemmy.nz 9 points 4 days ago

Yeah I get that, but I would install docker, cloudflared, etc by piping a convenience script to bash without hesitation. I've already decided to install their binary, I don't see why the install script is any higher risk.

I know it's a controversial thing for everyone to make their own call on, I just don't think the risk for a bash script is any higher than a binary.

[-] moonpiedumplings@programming.dev 9 points 4 days ago* (last edited 4 days ago)

I won't lie, I use curl | bash as well, but I do dislike it for two reasons:

Firstly, it is usually much, much easier to compromise the website hosting the script than the binary itself. Distributed binaries are usually signed by multiple keys from multiple servers, making them highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make tampering trivial to detect as well.

Website hosting infrastructure, on the other hand, is generally nowhere near as secure. It's typically one or two VPSes, and there is no signature or verification that the content is "official". So even if I'm not tampering with the binary, I can still tamper with the bash script to add extra goodies to it.

Secondly (though not really relevant to what OP is talking about), just because I trust someone to give me a binary in a mature programming language they have experience writing in doesn't mean I trust them to give me a script in a language known for footguns. A Steam bug in its bash launcher script once deleted a user's home directory. There have also been issues with AUR packages, which are basically bash scripts, breaking people's systems. When it comes to user/community created scripts, I mostly trust them not to be malicious; I am more fearful of a bug or mistake screwing things up. But at the same time, I have little confidence in my ability to spot those bugs.

Generally, I only make an exception for running bash installers if the program being installed is a "platform" that I can use to install more software. K3s (Kubernetes distro), or the Nix package manager are examples. If I can install something via Nix or Docker then it's going to be installed via there and not installed via curl | bash. Not every developer under the sun should be given the privilege of running a bash script on my system.

As a sidenote, docker doesn't recommend their install script anymore. All the instructions have been removed from the website, and they recommend adding their own repos instead. Personally, I prefer to get it from the distro's repositories, as that's usually the simplest and fastest way to install docker nowadays.

[-] surewhynotlem@lemmy.world 5 points 4 days ago

It's really only about trusting the source. Your operating system surely has thousands of scripts that you've never read and never checked. And wouldn't have time to. And people don't complain about that.

But it's really bad practice to run random things from random sites. So the practice of downloading a script and running it is frowned upon. Mostly as a way of maintaining good security hygiene.

[-] cecilkorik@piefed.ca 20 points 4 days ago

And it's wild how much even that has been absolutely normalized by all these shitty lazy developers and platforms. Vibe coding is just going to make it worse. All these programs that look nice on the surface and are just slop on the inside. It's going to be a mess.

[-] jtrek@startrek.website 17 points 4 days ago

Most developers I've looked at would happily just paste the curl|bash thing into the terminal.

I often skim the script in the browser, but (a) this post shows that's not foolproof, and (b) a sufficiently sophisticated malicious script would fool a casual read.

[-] Ephera@lemmy.ml 6 points 4 days ago

Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.

I mean, I typically see it used for installing applications, and so long as TLS is used for the download, I'm still not aware of a good reason why you should check the Bash script in particular in that case, since the application itself could just as well be malware.

Of course, it's better to check the Bash script than not to check it, but at that point we should also advise downloading the application's source code, reviewing it, and compiling it yourself.
At some point you just have to bite the bullet, and I have not yet seen a good argument for why the Bash script deserves special treatment here...

Having said that, for cases where you're not installing an application, yeah, reviewing the script allows you to use it, without having to trust the source to the same degree as you do for installing an application.

[-] BluescreenOfDeath@lemmy.world 17 points 4 days ago

The post is specifically about how you can serve a totally different script than the one you inspect. If you use curl to fetch the script via terminal, the webserver can send a different script to a browser based on the UserAgent.

And whether or not you think someone would be mad to do it, it's still a widespread practice. The article mentions that piping curl straight to bash is already standard procedure for Proxmox helper scripts. But don't take anyone's word for it, check it out:

https://community-scripts.github.io/ProxmoxVE/

It's also the recommended method for PiHole:

https://docs.pi-hole.net/main/basic-install/

[-] mrnobody@reddthat.com 8 points 4 days ago

The reality is a lot of newcomers to Linux won't even understand the risks involved; they run it because that's what they're told or shown to do. That's what I did for PiHole many years ago too, I'll admit.

[-] atzanteol@sh.itjust.works 7 points 4 days ago

I've been accused of "gate keeping" when I tell people that this is a shitty way to deploy applications and that nobody should do it.

[-] BluescreenOfDeath@lemmy.world 2 points 3 days ago

Users are blameless, I find the fault with the developers.

Asking users to pipe curl to bash because it's easier for the developer is just the developer being lazy, IMO.

Developers wouldn't get a free pass for taking lazy, insecure shortcuts in programming, so I don't know why they should get a free pass on this.

[-] mlg@lemmy.world 8 points 3 days ago

Use our easy bash oneliner to install our software!

Looks inside script

if [ "$(command -v apt-get)" ]; then apt-get install app; else echo "Unsupported OS"; fi

Still less annoying than trying to build something from source that the dev claims has like 3 dependencies but in reality requires 500MB of random packages you've never heard of, all while their build system doesn't do any pre-compilation checking, so the build fails after a solid hour of compiling.

[-] xylogx@lemmy.world 13 points 4 days ago

Yes, this has risks. At the same time, anytime you run any piece of software you face the same risks, especially if that software updates itself from the internet. Take a look at the NIST docs on software supply chain risks.

[-] ShortN0te@lemmy.ml 9 points 4 days ago

Not completely correct. A lot of updaters use signatures to verify that what was downloaded is signed by the correct key.

With curl | bash there is no such check in place.

So strictly speaking it is not the same.
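The kind of check a signed updater does can be approximated by hand with a published checksum; a sketch (the checksum here is generated locally just to keep the demo self-contained — in real use it would come from the project over a separate trusted channel):

```shell
# stand-in for downloading a script and its published SHA-256
echo 'echo hello from installer' > demo-install.sh
sha256sum demo-install.sh > demo-install.sh.sha256
# the verification step that curl | bash skips entirely: refuse to run on mismatch
sha256sum -c demo-install.sh.sha256 && bash demo-install.sh
```

Real updaters go further and verify a signature against a pinned key (e.g. gpg --verify), so a compromised web server can't simply regenerate the checksum alongside the tampered script.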

[-] axx@slrpnk.net 4 points 4 days ago

This is a bit like saying crossing the street blindfolded while juggling chainsaws and crossing the street on a pedestrian crossing while the light is red for cars both carry risk. Sure. One's a terrible idea though.

[-] Nibodhika@lemmy.world 3 points 3 days ago

But those are two very different things. I can very easily give you a one-liner using curl|bash that will compromise your system. To get the same level of compromise through a properly authenticated channel such as apt/pacman/etc., you would need to either compromise their private keys and attack before they notice and rotate them, or sneak malicious code into an official package. Either of those is orders of magnitude more difficult than writing a simple bash script.

[-] MehBlah@lemmy.world 7 points 4 days ago

Never have I ever piped curl to bash.

[-] ikidd@lemmy.world 8 points 4 days ago

Oh, people will keep using it no matter how much you warn them.

Proxmox-helper-scripts is a perfect example. They'll agree with you until that site comes up, and then it's "it'll never, ever get hacked and subverted, nope, can't happen, impossible".

Wankers.

[-] krispyavuz@lemmy.world 9 points 4 days ago

Curl bash is no different than manually running an sh script you don't know…

[-] K3can@lemmy.radio 7 points 4 days ago

True, but this is specifically about scripts you think you know, and how curl bash might trick you into running a different script entirely.

[-] ssfckdt 3 points 3 days ago

I'm a bit lost with

a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.

You... You just... You just dump the curl output to a file, examine that, and then run it if it's good.

Just a weird imagined sequence to me.

[-] martini1992@lemmy.ml 3 points 3 days ago

Worse than that, the server can change its response based on user agent, so you need to curl it to a file first; a browser could be served a completely different response.

[-] K3can@lemmy.radio 3 points 3 days ago

Which is exactly what is demonstrated in the post. 🙃

[-] quick_snail@feddit.nl 6 points 4 days ago

Anytime I see a project that has this in its install instructions, I don't use that project.

It shows how dumb the devs are

[-] Buddahriffic@lemmy.world 3 points 3 days ago

An alternative that avoids the user agent trick is curl | cat, which just prints the result of the first command to the console. curl > filename.sh will write it to a script file that you can review, then mark executable and run if you deem it safe. That's safer than doing a curl | cat followed by a curl | bash (because it's still possible for the 2nd curl to return a different set of commands).

You can control the user agent with curl and spoof a browser's user agent for one fetch, then a second fetch using the normal curl user agent and compare the results to detect malicious urls in an automated way.

A command line analyzer tool would be nice for people who aren't as familiar with the commands and arguments (and to defeat obfuscation), though I believe the general problem is undecidable, so it won't likely ever be completely foolproof. Though maybe it could be if it ran the script in a sandbox to see what it does instead of just analyzing it.
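The two-fetch comparison described above can be sketched like this, runnable offline (fetch_with_ua is a hypothetical stand-in for `curl -fsSL -A "$ua" "$url"`; an honest server returns the same bytes regardless of the user agent, a cloaking one doesn't):

```shell
# stand-in for fetching the same URL with a given User-Agent header;
# this fake "server" ignores the UA, like an honest one would
fetch_with_ua() {
  echo 'echo hello'
}
fetch_with_ua 'curl/8.5.0'  > as-curl.sh
fetch_with_ua 'Mozilla/5.0' > as-browser.sh
if diff -q as-curl.sh as-browser.sh >/dev/null; then
  echo "same script for both user agents"
else
  echo "WARNING: server response depends on the user agent"
fi
```

Note this check still can't catch the timing-based variant discussed earlier in the thread, since both fetches here complete without executing anything.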

[-] Decronym@lemmy.decronym.xyz 6 points 4 days ago* (last edited 2 days ago)

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:

DNS: Domain Name Service/System
HTTP: Hypertext Transfer Protocol, the Web
PiHole: Network-wide ad-blocker (DNS sinkhole)
SSL: Secure Sockets Layer, for transparent encryption
TLS: Transport Layer Security, supersedes SSL
VPS: Virtual Private Server (opposed to shared hosting)

5 acronyms in this thread; the most compressed thread commented on today has 12 acronyms.

[Thread #111 for this comm, first seen 23rd Feb 2026, 04:40] [FAQ] [Full list] [Contact] [Source code]

[-] neidu3@sh.itjust.works 2 points 3 days ago* (last edited 3 days ago)

Running arbitrary text from the internet through an interpreter.. what could possibly go wrong.

I need to set up a website with

fork while 1

...Just so I can (try to) convince people to

curl | perl

it

...rhyme intended.

[-] sturmblast@lemmy.world 3 points 4 days ago

You mean blindly running code is bad? /s

[-] quick_snail@feddit.nl 3 points 4 days ago

a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.

Wow, I never thought anyone would be that dumb.

Why wouldn't they just wget it, read it, and then execute it?

[-] axx@slrpnk.net 5 points 4 days ago* (last edited 4 days ago)

Oh, the example in the article is the nice version of this attack.

Checking the script as downloaded by wget or curl and then piping curl to bash is still a terrible idea, as you have no guarantee you'll get the same script in both cases.

this post was submitted on 23 Feb 2026
175 points (100.0% liked)
