you’d have to be mad to willingly pipe a script to bash without checking it. holy shit
And you'd better inspect and execute a downloaded copy, because a malicious actor can serve a different file to curl/wget than to your browser
They can even serve a different file for curl vs curl|bash
Yeah, they do. I remember the demo being pretty impressive ten or fifteen years ago!
Does curl send a different useragent when it's piped?
Searching for those words just vomits 'hOW to SeT cUrL's UseRaGenT' blog spam.
It's timing-based. When piped a script, bash executes each line completely before taking the next line from the input, and curl has a limited output buffer. So:
- Start with an operation that takes a long time: a sleep, or, if you want it less obvious, a download, an unzip, an apt update, etc.
- Fill the buffer with more bash commands.
- On the server, measure whether curl stops downloading the script at some point.
- Serve a malicious payload.
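The steps above can be sketched locally, with a pipe standing in for the HTTP connection. This is a toy illustration, not the article's actual code: the payload strings are placeholders, and a Linux-sized pipe buffer of 64 KiB is assumed.

```shell
#!/usr/bin/env bash
# bash executes each line of piped input before reading the next, so while
# `sleep` runs the pipe fills up and the writer's output blocks -- and the
# writer can time that stall to tell "piped to bash" from "saved to a file".
detect_pipe_to_bash() {
  echo 'sleep 3'                           # step 1: a long-running operation
  start=$(date +%s)
  head -c 131072 /dev/zero | tr '\0' '#'   # step 2: 128 KiB of comment filler
  echo                                     # newline terminating the comment
  elapsed=$(( $(date +%s) - start ))       # step 3: did the reader stall?
  if [ "$elapsed" -ge 2 ]; then
    echo 'echo "payload for curl | bash victims"'   # step 4: swap the payload
  else
    echo 'echo "innocent script"'
  fi
}
```

`detect_pipe_to_bash | bash` yields the "victim" payload, while `detect_pipe_to_bash > script.sh` saves the innocent version, because writing to a file never stalls.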
Oh that is clever.
Not that I know of, which means I can only assume it'll be a timing-based attack.
With strategic use of sleep statements in the script you should stand a pretty good chance of detecting the HTTP download blocking while the script execution is paused.
If you were already shipping the kind of script that unpacks a binary payload from the tail end of the file and executes it, it's well within the realm of possibility to swap it for a different one.
Yep! That's what the post shows.
I created a live demo file, too, so that you can actually see the difference based on how you request the file.
Is it different from running a bash script you downloaded without checking it? E.g. the installer that you get with GOG games?
Genuine question, I'm no expert.
I have no problems with running scripts from the internet, AFTER you check them. Do NOT blindly run a script you found on the internet. As others have said download them, then check them, then and only then run them if they're safe. NEVER pipe to bash, ever.
Ok but not everyone has that skill. And anyway, how is this different to running a binary where you can't check the code?
It's exactly the same. Don't run binaries you don't trust fully. But I get what you mean. miley_cyrus_nude.jpg.exe is probably gonna end badly.
Yeah I get that, but I would install docker, cloudflared, etc by piping a convenience script to bash without hesitation. I've already decided to install their binary, I don't see why the install script is any higher risk.
I know it's a controversial thing for everyone to make their own call on, I just don't think the risk for a bash script is any higher than a binary.
I won't lie, I use curl | bash as well, but I do dislike it for two reasons:
Firstly, it is usually much, much easier to compromise the website hosting the script than the binary itself. Distributed binaries are usually signed by multiple keys from multiple servers, making them highly resistant to tampering. Reproducible builds (two users compiling a program get the same output) make it trivial to detect tampering as well.
Meanwhile, website hosting infrastructure is generally nowhere near as secure. It's typically one or two VPSes, and there is no signature or verification that the content is "official". So even if an attacker can't tamper with the binary, they can still tamper with the bash script to add extra goodies to it.
Secondly (though not really relevant to what OP is talking about), just because I trust someone to give me a binary in a mature programming language they have experience writing in doesn't mean I trust them to give me a script in a language known for footguns. A Steam bug in their bash script once deleted a user's home directory. There have also been issues with AUR packages, which are basically bash scripts, breaking people's systems. When it comes to user/community created scripts, I mostly trust them not to be malicious; I am more fearful of a bug or mistake screwing things up. But at the same time, I have little confidence in my ability to spot these bugs.
Generally, I only make an exception for running bash installers if the program being installed is a "platform" that I can use to install more software; K3s (a Kubernetes distro) and the Nix package manager are examples. If I can install something via Nix or Docker, then that's how it gets installed, not via curl | bash. Not every developer under the sun should be given the privilege of running a bash script on my system.
As a sidenote, Docker doesn't recommend their install script anymore. The instructions have been removed from the website, and they recommend adding their own repos instead. Personally, I prefer to get it from the distro's repositories, as that's usually the simplest and fastest way to install Docker nowadays.
It's really only about trusting the source. Your operating system surely has thousands of scripts that you've never read and never checked. And wouldn't have time to. And people don't complain about that.
But it's really bad practice to run random things from random sites. So the practice of downloading a script and running it is frowned upon. Mostly as a way of maintaining good security hygiene.
And it's wild how much even that has been absolutely normalized by all these shitty lazy developers and platforms. Vibe coding is just going to make it worse. All these programs that look nice on the surface and are just slop on the inside. It's going to be a mess.
Most developers I've looked at would happily just paste the curl|bash thing into the terminal.
I often would skim the script in the browser, but (a) this post shows that's not foolproof, and (b) a sufficiently sophisticated malicious script would fool a casual read.
Most developers I’ve looked at would happily just paste the curl|bash thing into the terminal.
I mean, I typically see it used for installing applications, and so long as TLS is used for the download, I'm still not aware of a good reason why you should check the Bash script in particular in that case, since the application itself could just as well be malware.
Of course, it's better to check the Bash script than to not check it, but at that point we should also advise to download the source code for the application, review it and then compile it yourself.
At some point, you just have to bite the bullet and I have not yet seen a good argument why the Bash script deserves special treatment here...
Having said that, for cases where you're not installing an application, yeah, reviewing the script allows you to use it, without having to trust the source to the same degree as you do for installing an application.
The post is specifically about how you can serve a totally different script than the one you inspect. If you use curl to fetch the script from the terminal, the webserver can send a different script to a browser based on the User-Agent.
And whether or not you think someone would be mad to do it, it's still a widespread practice. The article mentions that piping curl straight to bash is already standard procedure for Proxmox helper scripts. But don't take anyone's word for it, check it out:
https://community-scripts.github.io/ProxmoxVE/
It's also the recommended method for PiHole.
The reality is a lot of newcomers to Linux won't even understand the risks involved; they run it because that's what they're told or shown to do. That's what I did for PiHole many years ago too, I'll admit.
I've been accused of "gate keeping" when I tell people that this is a shitty way to deploy applications and that nobody should do it.
Users are blameless, I find the fault with the developers.
Asking users to pipe curl to bash because it's easier for the developer is just the developer being lazy, IMO.
Developers wouldn't get a free pass for taking lazy, insecure shortcuts in programming, I don't know why they should get a free pass on this.
Use our easy bash oneliner to install our software!
Looks inside script
if [ -n "$(command -v apt-get)" ]; then apt-get install app; else echo "Unsupported OS"; fi
Still less annoying than trying to build something from source that the dev claims has like 3 dependencies but in reality requires 500 MB of random packages you've never even heard of, all while their build system doesn't do any pre-compilation checking, so the build fails after a solid hour of compilation.
Yes, this has risks. At the same time, anytime you run any piece of software you are facing the same risks, especially if that software is updated from the internet. Take a look at the NIST docs on software supply chain risks.
Not completely correct. A lot of updaters work with signatures to verify that what was downloaded is signed by the correct key.
With curl | bash there is no such check in place.
So strictly speaking, it is not the same.
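For comparison, even a bare-bones version of "such a check" looks like the hypothetical helper below, which refuses to run a file unless its hash matches one obtained out-of-band. (Real package managers go further and verify full GPG signatures.)

```shell
#!/bin/sh
# Run a downloaded script only if it matches a SHA-256 hash published
# over a separate, trusted channel (website, release notes, etc.).
verify_then_run() {
  file="$1"
  expected="$2"   # the hash you obtained out-of-band
  actual=$(sha256sum "$file" | awk '{print $1}')
  if [ "$actual" = "$expected" ]; then
    sh "$file"
  else
    echo "checksum mismatch for $file: refusing to run" >&2
    return 1
  fi
}
```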
This is a bit like saying crossing the street blindfolded while juggling chainsaws and crossing the street on a pedestrian crossing while the light is red for cars both carry risk. Sure. One's a terrible idea though.
But those are two very different things. I can very easily give you a one-liner using curl | bash that will compromise your system. To get the same level of compromise through a properly authenticated channel such as apt/pacman/etc, you would need to compromise either their private keys (and attack before they notice and change them) or stick malicious code into an official package; either of those is orders of magnitude more difficult than writing a simple bash script.
Never have I ever piped curl to bash.
Oh, people will keep using it no matter how much you warn them.
Proxmox-helper-scripts is a perfect example. They'll agree with you until that site comes up, and then it's "it'll never, ever get hacked and subverted, nope, can't happen, impossible".
Wankers.
curl | bash is no different from manually running an sh script you don't know…
True, but this is specifically about scripts you think you know, and how curl bash might trick you into running a different script entirely.
I'm a bit lost with
a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it. In the
You... You just.... You just dump the curl output to a file, examine that, and then run it if it's good.
Just a weird imagined sequence to me.
Worse than that, the server can change its response based on user agent, so a browser could be served a completely different response; you need to curl it to a file first.
Which is exactly what is demonstrated in the post. 🙃
Anytime I see a project that had this in their install instructions, I don't use that project.
It shows how dumb the devs are
An alternative that will avoid the user agent trick is curl | cat, which just prints the result of the first command to the console. curl > filename.sh (or curl -o filename.sh) will write it to a script file that you can review, mark executable, and run if you deem it safe. That is safer than doing a curl | cat followed by a curl | bash, because it's still possible for the second curl to return a different set of commands.
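The save-first pattern can be bundled into one hypothetical helper: fetch once, read the script, and only ever execute the exact bytes you just reviewed, so a second request can't swap the payload.

```shell
#!/bin/sh
# Fetch a script once, page through it, and run it only after explicit
# confirmation -- the same file you inspected, not a fresh download.
fetch_review_run() {
  url="$1"
  tmp=$(mktemp) || return 1
  curl -fsSL -- "$url" -o "$tmp" || return 1
  ${PAGER:-less} "$tmp"               # read the whole thing first
  printf 'Run this script? [y/N] '
  read -r answer
  [ "$answer" = "y" ] && sh "$tmp"    # execute the reviewed bytes
  rm -f "$tmp"
}
```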
You can control the user agent with curl and spoof a browser's user agent for one fetch, then a second fetch using the normal curl user agent and compare the results to detect malicious urls in an automated way.
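A sketch of that automated comparison (the User-Agent string is just an example browser string; note this only catches User-Agent cloaking, not the timing trick discussed elsewhere in the thread):

```shell
#!/bin/sh
# Fetch the same URL once with a browser-like User-Agent and once with
# curl's default, then diff the two responses to detect cloaking.
check_ua_cloaking() {
  url="$1"
  as_browser=$(mktemp)
  as_curl=$(mktemp)
  curl -fsSL -A 'Mozilla/5.0 (X11; Linux x86_64) Firefox/133.0' -- "$url" -o "$as_browser"
  curl -fsSL -- "$url" -o "$as_curl"
  if diff -q "$as_browser" "$as_curl" >/dev/null; then
    echo "identical"
  else
    echo "differs: server is cloaking on User-Agent"
  fi
  rm -f "$as_browser" "$as_curl"
}
```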
A command-line analyzer tool would be nice for people who aren't as familiar with the commands and arguments (and to defeat obfuscation), though statically deciding what an arbitrary script does is undecidable in general, so it won't likely ever be completely foolproof. Maybe it could come close if the script is run in a sandbox to see what it does instead of just being analyzed.
Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:
| Fewer Letters | More Letters |
|---|---|
| DNS | Domain Name Service/System |
| HTTP | Hypertext Transfer Protocol, the Web |
| PiHole | Network-wide ad-blocker (DNS sinkhole) |
| SSL | Secure Sockets Layer, for transparent encryption |
| TLS | Transport Layer Security, supersedes SSL |
| VPS | Virtual Private Server (opposed to shared hosting) |
5 acronyms in this thread; the most compressed thread commented on today has 12 acronyms.
[Thread #111 for this comm, first seen 23rd Feb 2026, 04:40] [FAQ] [Full list] [Contact] [Source code]
Running arbitrary text from the internet through an interpreter.. what could possibly go wrong.
I need to set up a website with
fork while 1
...Just so I can (try to) convince people to
curl | perl
it
...rhyme intended.
You mean blindly running code is bad? /s
a more cautious user might first paste the url into the address bar of their web browser to see what the script looks like before running it.
Wow, I never thought anyone would be that dumb.
Why wouldn't they just wget it, read it, and then execute it?
Oh, the example in the article is the nice version of this attack.
Checking the script as downloaded by wget or curl and then piping curl to bash is still a terrible idea, as you have no guarantee you'll get the same script in both cases.
Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.