I'm tired of makefiles (dmitryfrank.com)
35 points by fanf2 on May 13, 2024 | 50 comments


I personally had the complete opposite experience with make: the more I work with it, the more I like it. You have to keep things simple, and IMO that's a good thing.

I also feel like good makefiles are usually very small and don't need much tweaking once they are in place.

I don't know your project structure, but it may be advisable to actually split those builds completely or merge them together.

For the rebuilds you can just use the -B switch.
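e.g. (assuming the default target is what you want rebuilt):

    # --always-make: treat every target as out of date
    make -B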

I am personally also a big fan of implicit compile and link rules, but I guess that's a matter of taste.
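As a sketch (file names made up), the built-in rules can carry a small C project almost by themselves:

    CFLAGS = -O2 -Wall
    LDLIBS = -lm

    # Implicit rules compile main.c/util.c into .o files, and because the
    # program shares its name with main.o, the built-in "%: %.o" rule
    # links all the listed objects.
    main: main.o util.o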


I'm the same. I've used a lot of build systems over the years and they all get complicated fast. And they always have "spooky action at a distance". I prefer the simplicity and explicitness of makefiles.

Every time I've set up makefiles for projects, someone comes in and tries to make dependency determination happen before building, rather than determining dependencies as we build (for use in the next build), which is just as robust. I don't know why software engineers don't like that, haha.
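A sketch of that idiom with gcc/clang-style dependency flags (file names made up; recipe lines are tab-indented in a real Makefile):

    CFLAGS += -MMD -MP        # write foo.d alongside foo.o while compiling
    OBJS := main.o util.o

    prog: $(OBJS)
        $(CC) $(LDFLAGS) -o $@ $(OBJS)

    # First build: no .d files yet, everything compiles anyway.
    # Later builds: reuse the dependency info generated last time.
    -include $(OBJS:.o=.d)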


> I also feel like good makefiles are usually very small and don't need much tweaking once they are in place.

This only works for small projects. Once your project grows beyond a handful of files and dependencies this breaks down real fast.

I've been on this train more than once: most non-trivial projects reach a point where you have to build for multiple targets, support multiple configurations, test in all those weird configurations, manage CI, manage external dependencies, build said dependencies, etc. A makefile that supports all of the above becomes a monstrosity long before you get there and at some point you give up and pull in a real build system to take care of it (they're all bad in their own way, pick your poison really with cmake/meson/bazel/<new shiny thing>, just less bad than make).

Sure, you can manage a giant pile of target-specific variables, platform-dependent scripts, and whatnot in make, but really it's much less painful to just bite the bullet and migrate to a build system that has thought about these problems before. Things like "this dependency is mandatory in this configuration, optional in these other configurations, and is linked differently depending on whether we're building static or shared objects and whether we're using the platform library or falling back to a vendored version" are basically impossible in my experience.

Makefiles are great when your project is a dozen `.c` files. The moment you grow beyond that it's time to move on and use something else.


I have used make for decades and I dislike it.

That said, it is *useful*, which is why it survives. A working makefile will generally build your product from scratch efficiently, and rebuild it as well while you're developing.

But the shame is that the effort I've put into writing makefiles over the years is tremendous. Spread across all software writers, it's excessive.

I think if effort had been put into evolving make (instead of writing makefiles), it would have been the tide to raise all the boats.

Some simple examples:

- add a "expect makefile version > n.nn" statement

- fix tab vs spaces

- better conditional stuff (think if/then/else, etc), maybe just logic in general

- allow snippets of other languages (shell or python) without embedding everything into makefile syntax

- more conscious handling of real targets (files) vs directories vs fake targets (like "clean" in make clean)

- easier (for users) handling of paths/directories/files

- maybe well-defined plugins, say a cross-compile-for-arm toolchain or similar

- lots more I can't recall.

I think cmake is kind of nice in a lot of respects (although it has its own issues as you get further along solving cross-platform build problems).



I like makefiles not as a build tool but as a way to put a better UI around long, difficult-to-type/remember commands in the repo:

    make up
    make down
    make clean
    make upgrade
    make migrate

Easier than having to think about it
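A sketch of that pattern (the command bodies here are made up):

    .PHONY: up down clean migrate

    up:
        docker compose up -d

    down:
        docker compose down

    clean:
        rm -rf build/ dist/

    migrate:
        ./manage.py migrate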


I find `just` to be better suited than `make` for this use case: https://github.com/casey/just

The sibling comment (re: just putting shell commands in files) is good too, and often I'll put all the various commands into a subdir, with `just` as just an interface to that, particularly if a given script/command gets complex.

Then `just` is really a signal of "hey, this repo supports this interface", with things like `just -l` for discoverability.


I have never used just, but one major advantage of make is that it pretty much exists everywhere - I can assume it's present and I'll be right almost every time.


Sigh. The problem is just isn't as portable and doesn't do cached file dependencies. It tries to be better than make but it's a step backwards, more like yet another shell language.


The lack of cached dependencies is an advantage in most non-compiled languages, though.


Just looks nice but isn't very prevalent yet. One of make's strengths is that it is everywhere.


Couldn’t you do this with shell scripts? Make just runs commands anyway. If you’re not using make’s dependency-tracking features, might as well just use the shell directly and use the simplest tool for the job.


make still has a better UI than shell scripts. With shell scripts you have to maintain multiple files whereas a Makefile has all the scripts in a single file.


No you don't. I like makefiles because of the dependencies between targets but for simple multi-target things I still do shell scripts like

    mything() {
      # ...
    }

    otherthing() {
      # ...
    }

    if [ $# -eq 0 ]; then
        echo "Usage: ./run.sh [mything, otherthing]"
        exit 1
    fi

    "$@"


I've been writing scripts like this but with a case/esac at the bottom to map $1 to the right command. I'd never thought about doing this instead. Thank you for opening my eyes!


Been There, Done That, Bought The T-Shirt, and it leads to unmaintainable shit. One script per action. The end.


You express no nuance and only say that something is wrong without explaining why. Your communication skills lack "skill" and "communication". Your absolutism convinces me that you are either wrong or incapable of explaining why you would be right.


Maybe for your use case, sure, but my use case is my own dotfiles, which have some of these helper scripts, and it's a good fit there.


That's a god-function antipattern and rapidly becomes non-portable.

Each lifecycle hook should be an executable of some sort like "scripts/{{command}}", be it a shell script or something else.


Why is that a better UI?


wth. A makefile can include other makefiles.


I'm not GP but:

Make gives me a single, predictable and stable interface to the repo.

From the root of the repo, I can type `make <tab>` and get a list of available tasks. Failing that, there's a Makefile at the root to read.

With scripts, I need to type... what? `./<tab>` and look for clues? Maybe the scripts are in `scripts/`, maybe they're in `build/`, maybe they're in `ci/`...

Even if I know where the scripts are, `make task` is quicker and easier to type than `./scripts/task.sh`.

One could just use a make task to call a script, which I have done, but in most cases if you do that, what's the point of the script?


> what's the point of the script?

For starters, not having to indent every line with a tab, put backslashes at the end of each line, or double every $ sigil in the script. Being able to transparently replace it with perl or python or some other non-shell-script is an occasional bonus.
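For example, even a small loop as a make recipe needs all of that (target name made up):

    sizes:
        for f in *.log; do \
            size=$$(wc -c < "$$f"); \
            echo "$$f: $$size bytes"; \
        done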


You picked the least important point I made, stripped it out of context and argued against it. This is a strawman. Feel free to use it as a piñata, I don't care.



I use a build system (and means of running project tasks) other than make, and I nonetheless usually include a Makefile in the root that does little more than say something to the effect of "what you're likely looking for is elsewhere", to accommodate this behavior.


Totally, I describe it as "shorthand for most used commands + lightweight documentation".

I made an entire presentation about this[1], trying to sell it to work mates.

[1]: https://jiby.tech/presentation/makefiles/makefiles.html


In my opinion the Linux kernel is the gold standard for makefile use (they call it kbuild).

It solves the cflags-changed problem by writing the command used to build each target out to the filesystem and updating that record only if the command has changed (see "if_changed").

It has some dependencies external to make (a C program and numerous shell scripts) to make it all work, but it's great and can be relied upon!

It wouldn't be that crazy to get content-based dependencies working, especially looking at kbuild as an example.
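A stripped-down sketch of that idea in plain GNU make (not the real kbuild macros; file names made up):

    cmd = $(CC) $(CFLAGS) -c -o foo.o foo.c

    foo.o: foo.c .foo.cmd
        $(cmd)

    # Rewrite .foo.cmd (bumping its mtime) only when the command line
    # differs from the one used for the last build.
    .foo.cmd: FORCE
        @echo '$(cmd)' | cmp -s - $@ || echo '$(cmd)' > $@

    FORCE: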


I tend to use redo (https://redo.readthedocs.io/en/latest/) for new projects. I find it particularly well suited for multi-step data processing workflows, where I might want to retrieve new data daily or hourly, or when I switch from one environment to another.
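For anyone who hasn't seen it, a redo target is just a shell script; a sketch (file names made up) might look like:

    # report.do -- rebuilt only when its declared inputs change
    redo-ifchange summarise.py data/latest.csv   # declare dependencies
    python summarise.py data/latest.csv > "$3"   # $3 is redo's temp output file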


I was describing nix to a friend and he offered this summary:

> it's make for grownups


Make would be so great if it had:

1. a way to track target freshness based on content hash

2. a way to declare variable values as dependencies


Both of these are easily achievable (if I understood the second one correctly; an example would help).
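Both can be approximated with the same stamp-file trick as the kbuild-style sketch above (GNU make; variable and file names made up):

    # 2. depend on the value of $(CONFIG): the stamp's mtime only changes
    #    when the value changes
    .config.stamp: FORCE
        @echo '$(CONFIG)' | cmp -s - $@ || echo '$(CONFIG)' > $@

    app: main.c .config.stamp
        $(CC) -DCONFIG='"$(CONFIG)"' -o $@ main.c

    # 1. content-based freshness: depend on a checksum of the inputs;
    #    anything listing .sources.sha as a prerequisite rebuilds only
    #    when file contents actually change
    .sources.sha: FORCE
        @sha1sum src/*.c | cmp -s - $@ || sha1sum src/*.c > $@

    FORCE: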


Have you worked with Nix much?


It seems that there is a lot of hate for make, but the second I see a project switching to cmake I run for the hills.


Make is ideal for trivial, portable projects. For everything else, use cmake or something different that is simple and usable, with clear documentation in the project on how to build, test, debug, and release.


Has anyone tried SCons? I came across someone using it at a place where I worked earlier.

Python-based make-like tool.

https://scons.org/


Someone contributed an SConstruct file for a personal project of mine a long time ago. I needed a configure script to just set a few options for a makefile, but I got that instead. It seemed good at first, but if you stray off the easy path it devolves into just a Python script. You need to speak Python because you end up writing both the configure and the make parts in it.

Meson is another Python-based alternative which doesn't bleed into the makefile-equivalent files. I don't use it myself because it lacked assembler support at one time.


It's been around for ages.

cmake and ninja can also both be installed from PyPI. There's also a compiled 1:1 clone of ninja called samurai (samu).


Sounds like he wants the -B option.


For my personal needs, when make isn't used as a build system but just for collecting some shell functions in a file, I've completely switched to a combination of xonsh/python-fire and never looked back. Xonsh looks a bit weird, but it's super fast to write, and I can always read it perfectly. A messy example, one of my local scripts:

../bin/ok:

    #!/usr/bin/env xonsh
    """
    $> ok android shell
    $> ok android ssh
    $> ok android sshd
    $> ok android sms
    """
    import fire as _fire

    class android:
        ip = '...'

        def shell(self):
            adb connect f'{self.ip}:5555'
            adb shell

        def sshd(self):
            adb connect f'{self.ip}:5555'
            adb shell input keyevent 26
            adb shell input swipe 500 800 500 300 200
            sleep 1
            adb shell am start -n com.termux/.app.TermuxActivity
            adb shell input text "sshd"
            adb shell input keyevent 66
            adb shell input text "exit"
            adb shell input keyevent 66

        def ssh(self):
            ssh -p 8022 f'user@{self.ip}'

        def sms(self):
            result = $(ssh -p 8022 f'user@{self.ip}' termux-sms-list -l 20 | jq ".[].body")
            print(result)
            def long_echo(): print(result)
            @(long_echo) | llm -m llama3 -s 'please translate messages to English, quote the original messages too'

    if __name__ == '__main__':
        _fire.Fire()


Author is now trying Bazel

I'm going the Dagger path

This CNCF talk covers both: https://www.youtube.com/watch?v=nZLz0o4duRs


Author tried bazel 8 years ago, which is when the article was written. Judging by his GH repos, he's still using Makefiles.

https://github.com/dimonomid/bulgaria-freelance-taxes/blame/...


I was going to comment that I am tired of undated articles. Where did you see when it was written?


It's in the URL; I missed it too until it was pointed out.


Anyone not working alone, or not working only on their own projects, will end up having to use tools created ages ago but refined gradually, however clunky the usage feels along the way.


Do you call bazel from dagger or what? You still need a build runner, no?


Nope, just write code instead of Bazel or Make

Everything runs in a container

You can of course run whatever tool you want inside containerized workflows with Dagger, which can be much more than a build tool.


Are you using a compiled language? If so, what are you building with? That is the part that Bazel optimizes. Once the build artifact is created, you can containerize it with Dagger, of course.


Is the Bazel version consistent across all local dev & build ci? How do tool updates & versions work as developers switch between branches and repos?

Run Bazel in Dagger for consistency or, more generally, build your binary in a container with Dagger using whatever tools you like. One main point of Dagger is consistency between local & cloud for the build scripts, whereas containers have traditionally been thought of as providing consistency between running locally and in the cloud.


(2016)



