1. 17
    1. 13

      I prefer shell for this kind of thing. When all your targets are .PHONY, you might as well use a shell script, since you’re not using the dependency engine.

      I touched on that in these two posts:

      Another reason I dislike using Make this way is that every recipe line is shell, but it goes through make’s own $ expansion first. This means that writing the current PID in a makefile is $$$$. Writing the exit status is $$?, etc. Very ugly IMO.
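
      For example (made-up target name, just to show the doubling):

      show-shell-vars:
      	@echo "the recipe shell's PID is $$$$"
      	@false || echo "exit status of the last command was $$?"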

      Heck, don’t you have to write \ after every single line too, e.g. if you want to write a multi-line if statement or for loop in a Makefile? All the snippets shown here are one line, so it doesn’t come up, but I hit that pretty often.
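
      Even a small loop ends up looking like this (illustrative only, not from the article):

      clean-backups:
      	@for f in *.bak; do \
      		if [ -f "$$f" ]; then \
      			echo "removing $$f"; \
      			rm "$$f"; \
      		fi; \
      	done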

      In other words, Make doesn’t upgrade gracefully to longer programs.

      On the other hand, many distros tab-complete makefile targets after make on the command line. I use my own bash completion script to complete function names in run.sh, as sketched below.
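
      Roughly this kind of thing (a sketch, not my actual script; it assumes the functions are defined in the plain name() { style):

      _complete_run_sh() {
        local funcs
        # list the top-level shell functions defined in run.sh
        funcs=$(awk -F'(' '/^[a-zA-Z0-9_-]+\(\)/ {print $1}' run.sh)
        COMPREPLY=( $(compgen -W "$funcs" -- "${COMP_WORDS[COMP_CWORD]}") )
      }
      complete -F _complete_run_sh ./run.sh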

      1. 5

        The second Makefile in the article does actually use dependencies to ensure the venv is active and updated at all times. But I do agree about all the problems with Makefiles. I prefer ninja build files, which I find far easier to read and write. Notably, ninja doesn’t provide phony targets that execute commands with no inputs or outputs.

        I have to say though: it’s no surprise that the author of a real, fully functional shell prefers shell scripts. 😉

        1. 3

          Yup Ninja is awesome! I just switched to it for part of Oil, and I plan to migrate the rest of the repo, and ditch Make altogether.

          I wrote a bunch of Makefiles from scratch over the last few years, and maintained them, so I could really “learn make”.

          In retrospect I could have just used Ninja and missed nothing. But then I wouldn’t know enough to criticize Make :)

          Recent comment that basically says the same thing: https://lobste.rs/s/ounjyq/compile_times_why_obvious_might_not_be_so#c_hrm1vg (also mentioned Ninja briefly on the latest blog post)

      2. 3

        I’ve gone back and forth. I used to be all-in on using a Makefile for consistency and discoverability, but lately I’ve been partial to a directory of reasonably-named shell scripts, for some of the reasons you mentioned.

    2. 5

      I also use a Makefile for my current Python side project. Some tricks I like:

      • Set .SHELLFLAGS to -eu -o pipefail -c, aka bash strict mode (this also needs SHELL := bash, since pipefail isn’t POSIX sh)
      • Add --warn-undefined-variables and --no-builtin-rules to MAKEFLAGS
      • Use .RECIPEPREFIX to avoid wrestling with tabs (see the sketch after this list)
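
      Put together, the preamble looks something like this (untested as pasted here, and the venv rule is just a made-up example to show the > prefix):

      SHELL := bash
      .SHELLFLAGS := -eu -o pipefail -c
      MAKEFLAGS += --warn-undefined-variables --no-builtin-rules
      .RECIPEPREFIX = >
      
      venv: requirements.txt
      > python3 -m venv venv
      > ./venv/bin/pip install -r requirements.txt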

      Unfortunately I can’t seem to recall where I found this first, but my favorite is definitely creating a help target with the following content:

      help: ## Display this help section
      > @awk 'BEGIN {FS = ":.*?## "} /^[a-zA-Z0-9_-]+:.*?## / {printf "\033[36m%-38s\033[0m %s\n", $$1, $$2}' $(MAKEFILE_LIST)
      

      This basically self-documents the Makefile; all you need to do is add a comment (with a double hash) after your target declaration.

      1. 3

        This was a level-up on that help command:

        ##@ Label sections
        .PHONY: help
        help:  ## Display this help
        	@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n  make \033[36m\033[0m\n"} /^[a-zA-Z0-9_-]+:.*?##/ { printf "  \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)
        
        ##@ Build commands
        .PHONY: build
        build: $(BIN) ## Build things
        	buildtool build
        

        Now you’ve got sections:

        $ make
        
        Usage:
          make
        
        Label sections
          help             Display this help
        
        Build commands
          build            Build things
        
    3. 4

      Sometimes I feel like I write just as much shell and Make as I do other languages. I’ve really dug into Make in the last ~3 years and just recently needed to enmakify some Python projects.

      I belong to a school of thought that Make is the base build tool, the lowest common denominator. It’s already available on macOS, Linux, etc. out of the box and trivial to install on Windows. I’ve worked at companies with thousands of developers or with just dozens, with varying skill levels, familiarity with a particular ecosystem’s tooling, patience for poor or missing onboarding documentation, and general tolerance for other teams’ preferences ranging from scared to flippant.

      Regardless, nearly all of them can figure out what to do if they can run make help and are presented with a menu of tasks that ideally are self-configuring entirely. As in, I clone a repo, run make help and see that there’s a test task that runs tests. I should then be able to run make test and… tests run. It may take some time to set up the environment — install pyenv, install the right Python version, install dependencies, etc. – but it will inevitably run tests with no other action required. This is an incredibly straightforward onboarding process! A brief README plus a well-written Makefile that abstracts away idiosyncrasies of the repo’s main language’s package manager, build system, or both, can accelerate contributors, even if your Makefile is as simple as:

      help:
      	@echo Run build or test
      build:
      	npm build
      	sbt build
      test:
      	npm test
      	sbt test
      

      My base Pythonic Makefile looks like this now, not guaranteed to work because I’m plucking things. I’m stuck on 3.6 and 3.7 for now but hope to get these projects up to 3.9 or 3.10 by the end of the year. I’m using Poetry along with PyTest, MyPy, Flake8, and Black.

      # Set this once and it gets used everywhere in the project setup
      PYTHON_VERSION ?= 3.7.1
      # the directories containing the library modules this repo builds
      LIBRARY_DIRS = mylibrary
      # build artifacts organized in this Makefile
      BUILD_DIR ?= build
      
      # PyTest options
      PYTEST_HTML_OPTIONS = --html=$(BUILD_DIR)/report.html --self-contained-html
      PYTEST_TAP_OPTIONS = --tap-combined --tap-outdir $(BUILD_DIR)
      PYTEST_COVERAGE_OPTIONS = --cov=$(LIBRARY_DIRS)
      PYTEST_OPTIONS ?= $(PYTEST_HTML_OPTIONS) $(PYTEST_TAP_OPTIONS) $(PYTEST_COVERAGE_OPTIONS)
      
      # MyPy typechecking options
      MYPY_OPTS ?= --python-version $(basename $(PYTHON_VERSION)) --show-column-numbers --pretty --html-report $(BUILD_DIR)/mypy
      # Python installation artifacts
      PYTHON_VERSION_FILE=.python-version
      ifeq ($(shell which pyenv),)
      # pyenv isn't installed, guess the eventual path FWIW
      PYENV_VERSION_DIR ?= $(HOME)/.pyenv/versions/$(PYTHON_VERSION)
      else
      # pyenv is installed
      PYENV_VERSION_DIR ?= $(shell pyenv root)/versions/$(PYTHON_VERSION)
      endif
      # where pip and poetry install from; defaults to the public index, override with an internal proxy if you have one
      PYPI_PROXY ?= https://pypi.org/simple/
      PIP ?= pip3
      
      POETRY_OPTS ?=
      POETRY ?= poetry $(POETRY_OPTS)
      RUN_PYPKG_BIN = $(POETRY) run
      
      COLOR_ORANGE = \033[33m
      COLOR_RESET = \033[0m
      
      ##@ Utility
      
      .PHONY: help
      help:  ## Display this help
      	@awk 'BEGIN {FS = ":.*##"; printf "\nUsage:\n  make \033[36m\033[0m\n"} /^[a-zA-Z0-9_-]+:.*?##/ { printf "  \033[36m%-15s\033[0m %s\n", $$1, $$2 } /^##@/ { printf "\n\033[1m%s\033[0m\n", substr($$0, 5) } ' $(MAKEFILE_LIST)
      
      .PHONY: version-python
      version-python: ## Echos the version of Python in use
      	@echo $(PYTHON_VERSION)
      
      ##@ Testing
      
      .PHONY: test
      test: ## Runs tests
      	$(RUN_PYPKG_BIN) pytest \
      		$(PYTEST_OPTIONS) \
      		tests/*.py
      
      ##@ Building and Publishing
      
      .PHONY: build
      build: ## Runs a build
      	$(POETRY) build
      
      .PHONY: publish
      publish: ## Publish a build to the configured repo
      	$(POETRY) publish $(POETRY_PUBLISH_OPTIONS_SET_BY_CI_ENV)
      
      .PHONY: deps-py-update
      deps-py-update: pyproject.toml ## Update Poetry deps, e.g. after adding a new one manually
      	$(POETRY) update
      
      ##@ Setup
      # dynamic-ish detection of Python installation directory with pyenv
      $(PYENV_VERSION_DIR):
      	pyenv install --skip-existing $(PYTHON_VERSION)
      $(PYTHON_VERSION_FILE): $(PYENV_VERSION_DIR)
      	pyenv local $(PYTHON_VERSION)
      
      .PHONY: deps
      deps: deps-brew deps-py  ## Installs all dependencies
      
      .PHONY: deps-brew
      deps-brew: Brewfile ## Installs development dependencies from Homebrew
      	brew bundle --file=Brewfile
      	@echo "$(COLOR_ORANGE)Ensure that pyenv is setup in your shell.$(COLOR_RESET)"
      	@echo "$(COLOR_ORANGE)It should have something like 'eval \$$(pyenv init -)'$(COLOR_RESET)"
      
      .PHONY: deps-py
      deps-py: $(PYTHON_VERSION_FILE) ## Installs Python development and runtime dependencies
      	$(PIP) install --upgrade \
      		--index-url $(PYPI_PROXY) \
      		pip
      	$(PIP) install --upgrade \
      		--index-url $(PYPI_PROXY) \
      		poetry
      	$(POETRY) install
      
      ##@ Code Quality
      
      .PHONY: check
      check: check-py check-sh ## Runs linters and other important tools
      
      .PHONY: check-py
      check-py: check-py-flake8 check-py-black check-py-mypy ## Checks only Python files
      
      .PHONY: check-py-flake8
      check-py-flake8: ## Runs flake8 linter
      	$(RUN_PYPKG_BIN) flake8 .
      
      .PHONY: check-py-black
      check-py-black: ## Runs black in check mode (no changes)
      	$(RUN_PYPKG_BIN) black --check --line-length 118 --fast .
      
      .PHONY: check-py-mypy
      check-py-mypy: ## Runs mypy
      	$(RUN_PYPKG_BIN) mypy $(MYPY_OPTS) $(LIBRARY_DIRS)
      
      .PHONY: format-py
      format-py: ## Runs black, makes changes where necessary
      	$(RUN_PYPKG_BIN) black --line-length 118 .
      

      Is this overkill? Maybe, but I can clone this repo and be running tests quickly. I still have some work to do to actually achieve my goal of clone-to-working-env in two commands (it’s three right now: git clone org/repo.git && make deps && make test) but I’ll probably get there in the next few days or weeks. Moreover, this keeps my CI steps as close as possible to what developers run. The only real things that have to be set in CI are some environment variables that Poetry uses for the make publish step, plus setting the version with poetry version $(git describe --tags), because git describe versions are not PEP-440 compliant without some massaging, and I’ve been lazy about that since our published tags will always be PEP-440 compliant anyway.
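
      For the record, the massaging I keep putting off is only a line or two; something like this (untested, and the exact PEP-440 form you want may differ):

      # v1.2.3            -> 1.2.3
      # v1.2.3-4-gabc1234 -> 1.2.3.post4+gabc1234
      poetry version "$(git describe --tags | sed -E 's/^v//; s/-([0-9]+)-(g[0-9a-f]+)$/.post\1+\2/')"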

      The Brewfile:

      # basic build tool, get the latest version
      # if you want to be sure you’re using it, run 'gmake' instead on macOS
      # or follow caveats in `brew info make` to make make brew's make
      brew 'make'
      
      # python version and environment management
      brew 'pyenv'
      # python dependency manager
      # a version from pypi instead of homebrew may be installed when running make deps
      brew 'poetry'
      

      The full pyproject.toml is an exercise left to the reader but here’s the dev-dependencies selection from one of them:

      [tool.poetry.dev-dependencies]
      flake8 = "3.7.9"
      black = "19.10b0"
      mypy = "^0.812"
      pytest = "^6.2.2"
      pytest-html = "^3.1.1"
      ansi2html = "*"
      pytest-tap = "^3.2"
      pytest-cov = "^2.11.1"
      pytest-mypy = "^0.8.0"
      lxml = "^4.6.2"
      

      Suggested improvements welcome. I’ve built Makefiles like this for Scala, Ruby, Rust, Java, C, Scheme, and Pandoc projects for a long time but feel like Make really is like Othello: a minute to learn, a lifetime to master.

      1. 4

        Regardless, nearly all of them can figure out what to do if they can run make help and are presented with a menu of tasks that ideally are self-configuring entirely. As in, I clone a repo, run make help and see that there’s a test task that runs tests. I should then be able to run make test and… tests run. It may take some time to set up the environment — install pyenv, install the right Python version, install dependencies, etc. – but it will inevitably run tests with no other action required.

        This is the reason we use Makefiles on my teams. Like you say, it’s the lowest common denominator, the glue layer, which enables things like multi-language projects to be managed in a coherent manner. I’d much rather call out to both npm and gradle from the Makefile, than use weird plugins to shovel npm into gradle or vice-versa. Makefiles are scripts + a dependency graph, so you can do things like ensuring particular files are in place before running commands, and this is not just about build artifacts, but also files downloaded externally (hello chromedriver).
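
        For example, a download-once rule like this (the version, URL, and npm script are made up for illustration):

        CHROMEDRIVER_VERSION ?= 114.0.5735.90
        
        bin/chromedriver:
        	mkdir -p bin
        	curl -fsSL -o $@.zip "https://chromedriver.storage.googleapis.com/$(CHROMEDRIVER_VERSION)/chromedriver_linux64.zip"
        	unzip -o $@.zip -d bin && rm $@.zip
        
        .PHONY: e2e
        e2e: bin/chromedriver  ## browser tests run only once the driver is present
        	npm run e2e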

        I have Makefiles in all my old projects, and they are a real boon when I need to make changes after a long time (like years) has passed. I also make it a habit to always plug this make tutorial whenever the topic of Make comes up. It’s a stellar tutorial.

        To quote Joe Armstrong: “Four good tools to learn: Emacs, Bash, Make and Shell. You could use Vi, I am not religious here. Make is pretty damn good! I use Make for everything, that good!”

      2. 2

        Could this 114 line Makefile be a gist or something?

    4. 1

      I just have my makefiles run commands in the venv using poetry run.

    5. 1

      I switched away from Makefiles to a python -m make style setup for Kanmail, specifically because I felt I wasn’t using any of Make’s actual features (dependencies etc.) and spent a lot of effort dealing with its quirks. Like the venv stuff though, which is neat!