Makefile means “make file”. It has been abused into being a task runner, which is the wrong tool for the job.
The “make file” is all about file dependencies based on last-modified dates: outdated target files can be rebuilt from their source files. It's dependency management and the essence of an incremental compiler, but all revolving around files, not tasks.
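That model fits in a few lines. A minimal sketch, where `pandoc` is just a stand-in for any compiler (recipe lines must be indented with a real tab):

```make
# report.pdf depends on report.md; make reruns the recipe only when
# report.md's modification time is newer than report.pdf's.
report.pdf: report.md
	pandoc report.md -o report.pdf
```

Run `make report.pdf` twice: the second run does nothing, because the target is newer than its prerequisite.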
What gives make a bad name is the same thing that gave JavaScript or m4 a bad name: these things are their own exotic birds, and doing them well requires new concepts and new behaviors.
You can indeed shoehorn them into what you know, but really you need to fully embrace their weird world.
See also forth, dc, awk, jq ...
It'd be nice to have a dedicated crash course on these things for people who understand conventional programming and have been doing the normal stuff for a number of years.
Also see supercollider, prolog, haskell, apl...
I think the most mainstream exotic bird people learn is Lisp. Doing all these things well is as different as Lisp is from, say, conventional Python.
Make may suck as a task runner, but it's better than a collection of random scripts and commands baked into CI configuration that evolve to become unrunnable in normal dev environments.
It’s almost comical to see “why Python” comments after all these years. I would’ve chosen Go to write this, but that’s beside the point.
Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But even then, some Python libraries have bigger communities than the combined communities of all these “better, faster, more awesome” languages.
Python is here to stay. Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts. That’s fine. LLMs write better Python than Go (my preferred language, or whatever yours is). And if you know anything about the AI research community, it’s C++, C, Python or GTFO.
Going forward, a lot more tools will be written in Python, mostly by people entering the field. On top of that, there’s a huge number of active Python veterans churning out code faster than ever. The network effect keeps on giving.
So whatever language you have in mind, it’s going to be niche compared to Python or JS. I don’t like it either. But if languages and tools were chosen on merit instead of tribalism, we wouldn’t be in this JS clusterfuck on the web.
I love python, have used it for years. I hate the dependency and multiple interpreter situation.
A great PL should stand on its own without the need for external tooling.
At this point I have given up on Python except when it's a little script that only uses the standard library. Otherwise I'm choosing a compiled language.
I use Python without any dependencies on web servers. Pip is cool, but you don't need to get pulled into node-like dependency hell.
For example, instead of requests you can use http.client; instead of Flask, http.server, socketserver.TCPServer, or just socket. If you want SQLite, don't jump to `pip install` anything; the sqlite3 module ships with Python.
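A minimal sketch of the stdlib-only approach (the hostname and table are placeholders):

```python
# Standard-library stand-ins for common third-party packages; nothing to pip install.
import http.client
import sqlite3

def fetch_status(host: str, path: str = "/") -> int:
    """http.client instead of requests: status code of a plain GET."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    try:
        conn.request("GET", path)
        return conn.getresponse().status
    finally:
        conn.close()

# sqlite3 instead of a third-party driver: SQLite is embedded in CPython.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (x INTEGER)")
db.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])
total = db.execute("SELECT sum(x) FROM t").fetchone()[0]
print(total)  # 6
```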
> It’s almost comical to see “why Python” comments ... Yes, Python installation is tricky, dependency management is a mess (even with uv, since it’s not standard tooling; another one will pop up), and performance is atrocious. But ... Newcomers love it because you need to know so little to get started. People don’t care about the little quirks when they begin, and eventually they just live with the warts.
I'm not sure if this is news to you or if you already know it, but, just to be explicit -- you know that the overwhelming majority of end users aren't gonna have `pip` installed on their systems, right? And that any project with "Installation instructions" that begin with a `pip` command aren't really gonna work in the general case?
Just wanna make sure that's well-understood... it's fine if you wanna build a tool in Python, but if you expect it to be practically usable, you need to do distribution of binaries, not `pip` targets...
This point has been pummeled to death for decades. Before Python, people did the same with Ruby and “gem.” Literally nothing is new here.
One of the reasons I write my tools in Go is exactly this. But if the tool was written in Go, people would complain about why not Rust and such. The point wasn’t to convey that Python doesn’t have its fair share of flaws, but to underscore that the HN crowd doesn’t represent any significant majority. The outside world keeps on using Python, and the number of Go or Rust users is most likely less than PyTorch or Scikit-learn users.
Shipping Python is hard and the language is slow. Also, tooling is bad. The newfangled ones are just a few in the long stream of ad hoc tooling over the past 20 years. Yet people write Python and will continue to do so. JS has a similar story, but it’s just a 10x worse language than Python.
Let me be even more explicit: if your installation instructions are `pip install ...` -- or `npm install ...` for that matter -- then you are automatically excluding a super-majority of potential users.
I don’t even write python these days. I just wrote my own version of a terminal llm-caller[^1] in Go for this exact same reason.
There’s a famous one that does the same thing but is written in Python. So it has its issues.
My point is, pip exists on most machines. pip install sucks, but it's not the end of the world. The HN crowd (including myself) has a tendency to make a big deal out of things that the majority don't care about IRL.
Some Makefiles use indentation or variable placement as semantic cues. If a tool rewrites them mechanically, it might clean things up while killing meaning. Is structural correctness enough, or do we need formatters that preserve human context too?
Ideally, we'd have linters that preserve the human context as well. But human context may be too ambiguous and too high-variance for that to be practical.
It's hard to say what's intent and what's not; maybe linters with many custom rules would work best.
Why not Python? I primarily program in C++, but I see it as a decent choice since Python is available on almost all recent machines. Of course Windows is a notable exception, but given it's a tool for developers, I guess Python should be present.
The number of issues we've had with pre-commit because it's written in Python and Python tooling breaks constantly...
In fairness, the latter point may be finally solved by using `uv` and `uv tool install`. Performance is still a major issue though. Yamllint is easily the slowest linter we use.
(I'm tempted to try rewriting it in Rust with AI.)
Performance only matters if you're doing something compute- or disk-intensive, and then only if the libraries you're using are Python all the way down. AI programming (at least the kind most of us do; I don't know about places like OpenAI) is generally done with Python using libraries that run some compiled language under the hood.
And in this case--a linter--performance is almost certainly never an issue.
Then remove it? There are always tradeoffs when adding tooling; I'm assuming you have it in your workflow to catch downstream issues because it saves more time in the long run.
It definitely is a problem when the tool you're going to use a few times a week takes an extra hundred milliseconds compared to a native solution. Especially when you need to process huge data files like hand crafted makefiles. I can totally feel your pain - extra effort would've been made to avoid that at the cost of development speed. /s
I find that writing anything substantially complex in python sacrifices the development speed. That isn't its strong suit. It's that a lot of people want to write their code in it by preference.
Yeah if only it was an extra 100 milliseconds a few times a week. We have yamllint (also written in Python) in our pre-commit (also written in Python) and it definitely adds a second or two.
`pip install ...` is not a reliable or appropriate mechanism for distribution of any kind of tool like this one. Table stakes is pre-compiled architecture-specific binaries.
I would've chosen Java because it's faster than Python and good for string manipulation. My cousin would've chosen Brainfuck because he's really good at it. Alas, this discussion is useless because none of us is the one who spent the effort to write the Makefile formatter and linter; we can only bikeshed all day about which decisions we would've made.
For the 100th time, I was responding to "[...] python and it is easy to do string manipulation with.". Perl is way better to do string processing in than Java, too, FWIW.
My comment has absolutely nothing to do with this project or its author, nor the language he has chosen. See the other comments in this thread.
I am not saying it is weird, I was just responding to parent with regarding to "string manipulation", and someone mentioned "performance", so I stated two facts about Perl.
I do not care whether or not this project is written in Python. Sure, he chose Python because he is more familiar with it. That is fair enough to me.
It really is, though. When I went looking for the fastest interpreted scripting languages, Perl and LuaJIT were among the fastest, meaning Python is slower than both.
It does not invalidate anything I have said, and this reasoning of yours is so flawed.
For example: I thought this or that piece of music or movie sucked. I do not need to know how to make a song or a movie to be able to criticize it, let alone have made one in a similar vein; same with books. I can criticize a book, or an article, without having written one myself on related topics.
All that said, where did I criticize? I did not criticize anything, at all.
I stated facts. Perl is indeed faster than Python, and Perl was indeed made with string manipulation in mind. I made no comment about this project or its author, thus, it was not a criticism of any sort.
I do not care about movie makers and singers in general, and for the most part I do not have direct contact with them, so it would be futile to offer any advice. I did offer advice to a couple of singers before though. What is your point besides being unnecessarily defensive over two simple, stated facts? As I said, it was not a criticism of the author or the project, it was a response to your comment. Since this is going to lead nowhere, I am going to stop responding to this thread.
Remember, you cannot criticize (even though it was not criticism) unless you have something to show for it! Next time someone critiques an article, we have to make sure to let them know it is wrong to criticize unless they have written an article themselves on the same topic.
FWIW it really was just about his comment, and I made two statements: Perl is faster than Python, and that Perl is especially good for string manipulation. I do not mind that he chose Python, good for him.
Make allows you to specify dependencies for your targets, which are themselves targets. As such, you do not need to rely on brittle string-concatenation approaches. It is a tool built for this.
I personally like going to a project folder and running "make run", no matter what language or setup I have, to run the project. It lets me unify access to projects.
I also take great care to make these runs reproducible, using lock files and other things of the ecosystems I am using, whenever possible. I work on another machine? git clone, make run. Or perhaps git clone, make init, make run.
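The workflow described above can be sketched as a small task-runner Makefile; the `npm` commands are placeholders for whatever the ecosystem's clone-and-run steps are:

```make
.PHONY: init run test

init:        ## install pinned dependencies from the ecosystem's lock file
	npm ci

run: init    ## one uniform entry point, whatever the stack is underneath
	npm start

test: init
	npm test
```

With this in place, `git clone` followed by `make run` works the same across projects.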
I'm not so sure most people would agree with you. Though I think plenty would.
I dare say that developers like environment variables more than before. Consider that Docker images, and hence Helm charts, are entirely controlled via environment variables. These very popular dev tools suffer from the same problem of having near-zero easy discoverability of what those environment variables might be. Yet they are very popular.
But I don't think Make usually uses all that many environment variables. You're usually specifying build targets as the command line arguments. Automake and autogen usually generate these makefiles with everything hard-coded.
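For what it's worth, when a makefile does consult the environment, it's typically through overridable defaults like these (a generic sketch, not from any particular generated makefile):

```make
# ?= assigns a default only if the variable is not already set, e.g. by
# the environment (`CC=clang make`) or the command line (`make CC=clang`).
CC ?= cc
CFLAGS ?= -O2

app: main.c
	$(CC) $(CFLAGS) -o app main.c
```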
Also, it's very easy to get started with, and it's universally available. Makes it very easy to like.
Unless your company forces you to use Windows, which is still much more common than many would like to admit. And yes, WSL exists, but in my experience, if a company is unwilling to allow macOS, there's a good chance they either don't allow enabling Hyper-V, or the security software they use is such garbage that a Hyper-V-enabled system is effectively unusable.
Dunno, there are other aspects of environment variables that deteriorate the dev experience. They're very conducive to spooky action at a distance, since they're silently being passed along from parent-process to child-process (except when they aren't).
They can cause a lot of annoying bugs, and sometimes it's hard to track down where they are coming from (especially when dealing with stuff running in containers).
> Interacting with CLI tools using only env vars as arguments is cartoonishly bad dev experience.
Make excels at what it's designed to do: specify a configurable DAG of tasks that generate artifacts, execute them, and automatically determine which subgraph requires updates and which can be skipped by reusing their artifacts.
I wonder: which tool do you believe does this better than Make?
I don't think Tup managed to present any case. Glancing at the page, the only conceivable synthetic scenarios where they can present Tup in a positive light are build times for > 10k files, and only in a synthetic scenario involving recompiling partially built projects. And what's the upside of those synthetic scenarios? Shaving a couple of seconds off rebuilds? That's hardly a compelling scenario.
Abuse? Running linters, code analyzers, configuration tools, template engines, spellcheckers, pulling dependencies, building dependencies with different build systems.
Sufficiently complex projects need to involve a lot of weird extra scripts, and if a build system cannot fulfil that... then it needs to be wrapped in a complex bash script anyway.
You don't have to write Make invocations by hand... It's just a tool that can be called from any editor or IDE (or by automatic file watchers). Environment variables aren't really relevant to Make either, unless you really want to misuse it as a command runner.
I often prefer to work in in-extremis environments where there is no internet access, and hence no easy way to get ahold of make; it's given me a habit of just writing a build.bash script to do what make does most of the time. I haven't really found myself missing it that much.
If you can install bash on your airgapped dev box, why wouldn't you install make on it? Make is part of the core dev environment on just about every distro under the sun.
I am confused, because this means that you won't be able to install anything: no compiler, no 3rd-party libraries, and no text editor that isn't preinstalled.
It's 80% of what you want and it's installed everywhere.
You could go for something closer to exactly what you want, but now you've got an extra set up step for devs and something else for people to learn if they want to change it.
I would say if you're looking for cli args then you shouldn't be using any wrapper like make at all. Just call the underlying tool directly. Make is for doing big high level things in the standard way, nowadays quite often in CI pipelines.
Yep, that's how I used it on the job before. "make test" would run tests locally and in CI pipeline, keeping the CI file refreshingly short at that point.
On Lisp being exotic? It's damn easy. Haskell is far worse.
But most people don’t realize in many cases they can do better than that.
* https://git.maandree.se/makel
Or unmake.
* https://crates.io/crates/unmake
Or checkmake.
* https://github.com/checkmake/checkmake (https://news.ycombinator.com/item?id=32460375)
Or make-audit.
* https://github.com/david-a-wheeler/make-audit
Or the Sublime linter for makefiles.
* https://github.com/giampaolo/SublimeLinter-contrib-makefile
It hasn't taken quite the 50 years that we are told, has it? (-:
Some more thoughts: http://calvinlc.com/p/2025/06/10/thank-you-and-goodbye-pytho...
[1]: https://github.com/rednafi/q
https://aur.archlinux.org/packages/python-bake-git
Does this happen to support IDEs like VS Code?
2. Terrible installation UX.
Performance only matters if you care about performance, and I do care about performance. If you don't, fine I guess.
Also format-on-save is a common workflow.
Where it's less great is complicated recipes and debugging.
Like sudo for example.
So many problems related to that.
But the Internet’s make mind-share means you still have to know make.
Edit: and make lets you use make to essentially run scripts/utils. People love to abuse make for that. Can’t do that with tup.
`tup` relies on a stateful database, which makes it incomparable to `make`.