Thoughts on dotfiles


I've maintained a dotfiles repository since 2011 (at least, that's when I first put it into git) and haven't thought much about it since. I wrote a basic "dotfiles installation" script back in 2013 and have been using and lightly modifying it ever since. My primary use case was just syncing dotfiles between my personal and corporate MacBooks, so it didn't need much flexibility. However, the Makefile wasn't trivial, and even simple tasks like adding a new file required knowing the magic incantation (a combination of undocumented convention and GNU Make functions).[1]

I recently reevaluated my strategy since I was setting up a new workstation for the first time in a while, and it was a Linux box, which meant there would be configuration that wasn't compatible with my Macbook. I decided that I needed to improve the situation, and came up with a list of personal priorities.

  1. Minimize the required knowledge. I only go near my dotfiles manager once in a while, so it's very easy to forget any customizations or processes I need to follow. The ideal solution should be very easy to reason about and require few, if any, specialized commands or configuration files.
  2. Minimize the likelihood of failure. I might be configuring a macOS machine or a Linux machine of any distribution, and some software packages might not be available on the machine during installation. Installing my dotfiles should always work, regardless of the existing configuration of the target machine.
  3. Minimize the friction of updating. Sharing a modification from one machine to the rest should be as painless as possible. In general, I already have a working configuration file on one of my machines, and I just want that file, verbatim, on another machine.

Of course, everybody and their brother has their own preferred solution for managing their dotfiles, so naturally there's a plethora of prior art to choose from. I looked through all of the projects on GitHub does dotfiles, but (unsurprisingly) found that none were quite what I was looking for. While reviewing the list, I found a few factors that caused me to rule out most of the projects immediately:

Projects that bill themselves as "dotfile package managers". These all violate priority 1 for me by introducing a new set of commands/configuration to learn so I can manage my "dotfiles packages". Beyond that, the functionality to "update" my dotfiles to a version that someone else wrote that I have never reviewed is a huge violation of all of my priorities. There's no way I would introduce this level of complexity into my workflow.

Projects that depend on (node|python|ruby|gcc|apt). These violate priority 2 for me. Not all of my systems have these language runtimes installed, and I don't want to install one just to create some symlinks. Many scripting languages are also moving targets, requiring a specific version of the language runtime in order to function properly, which might not be the version installed by the OS package manager. The best projects are those that can be run directly from the git repository or have a single statically-linked binary distributed with the project.

Projects that interact with git. I use git every day and have a git workflow that I use every day. Dotfile managers that attempt to manage my dotfiles git repository (or "wrap git") end up putting me in situations outside of my "git happy path", and inadvertently cause more confusion than simplicity.

Projects that don't clean up unused configuration files. This one is minor, but most dotfiles managers are good at getting files onto your system, but bad at getting them off, leaving dangling symlinks in their wake. When I remove the configuration from my git repository, I want it to be removed from my home directory as well.

Building my own solution

Like so many before me, I decided to write my own solution: dfm. The goals for dfm line up with the three priorities from before:

  1. Simple. Conceptually, dfm should be as simple as cp, with no new concepts. When using it, there should be a small number of commands and few, if any, options for those commands. Everything should be clearly documented, but reading the documentation should not be necessary for daily use.
  2. Reliable. The program should be able to run on any freshly configured Linux or macOS system without any additional software installation. It should also work when run against an already-configured system, and not do anything silly like destroy the existing configuration.
  3. Easy. The program should have one command (or fewer!) to set up a new machine; one command to update the machine; and one command to update the repository from the machine.

I think dfm accomplished these goals well. In particular, I'm happy with the simplicity of the workflow:

  • dfm add will copy a file from your home directory into your git repository, then replace it with a symlink to the git repository. No configuration files need to be edited.
  • dfm link will sync your git repository to your home directory for both new files and removed files. It doesn't overwrite existing files by default, but using the industry-standard -f flag will force it.
  • There's a command to stop using dfm. I don't recall a single other dotfiles manager giving you a way out.
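Conceptually, the add/link workflow boils down to plain cp and ln -s. The sketch below is not dfm's actual implementation, just the idea, played out in a temporary directory standing in for both the home directory and the repository:

```shell
#!/bin/sh
# Conceptual sketch of the add/link workflow (not dfm's real code).
set -eu

demo=$(mktemp -d)        # stand-in for $HOME
repo="$demo/dotfiles"    # stand-in for the git repository
mkdir -p "$repo"
printf 'set number\n' > "$demo/.vimrc"   # an existing config file

# "dfm add ~/.vimrc": copy the file into the repo,
# then replace the original with a symlink to it.
cp "$demo/.vimrc" "$repo/.vimrc"
ln -sf "$repo/.vimrc" "$demo/.vimrc"

# "dfm link" after a file is deleted from the repo: the now-dangling
# symlink in the home directory is cleaned up rather than left behind.
rm "$repo/.vimrc"
[ ! -e "$demo/.vimrc" ] && rm -f "$demo/.vimrc"
```

The "as simple as cp" claim holds up in the sense that every step here is an ordinary filesystem operation with no repository metadata or manifest involved.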

I did violate priority 1 a bit in dfm by adding support for "multiple repositories". I decided to do this because it grants quite a bit of added flexibility (configurations for different OSes/machine roles, "dotfiles packages" through submodules, local-only configurations through .gitignore, etc.), at the cost of one required option during initial configuration.

I stuck to my priorities as much as possible for the rest. Here are a few features that dfm does not have, because they would violate the priorities:

  • Templated files. Instead, I can write a script that runs dfm link and then generates additional files.
  • Script running. Instead, I can write a script that runs dfm link and then any other scripts I need.
  • Secret management. Instead, I can use an external tool to store the secrets and combine it with a script that runs dfm link and then installs the secrets.
  • Syncing. Instead, I can do this using git, like I do every day for all of my projects.
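In practice, these omissions all compose the same way: an ordinary shell script in the repository that runs dfm link and then does the rest. The sketch below is a hypothetical install.sh; the .gitconfig.local file and the pass secret path are made-up examples, not anything dfm itself knows about:

```shell
#!/bin/sh
# Hypothetical install.sh: run dfm link, then handle what dfm leaves out.
set -eu

# 1. Symlink the repository into $HOME (skipped here if dfm isn't installed).
if command -v dfm >/dev/null 2>&1; then
    dfm link
fi

# 2. "Templating": instead of templating a tracked file, generate an
#    untracked machine-specific one after linking.
cat > "$HOME/.gitconfig.local" <<EOF
[user]
    email = me@$(uname -n)
EOF

# 3. "Secret management": pull secrets from an external store, if present.
if command -v pass >/dev/null 2>&1 && pass ls ssh/github >/dev/null 2>&1; then
    mkdir -p "$HOME/.ssh"
    pass show ssh/github > "$HOME/.ssh/github_key"
fi
```

Because the script is just shell, there is nothing new to learn and nothing that breaks when a particular tool is absent from a machine.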

As I said before, everybody has their own custom solution for this, and I'm sure that for many of the solutions, the extra complexity they added is worthwhile to them. I wanted to build something that was deliberately simple and worked everywhere I am likely to find myself, and I think dfm does a good job of it.

My advice

Obviously, if your priorities are similar to mine, dfm is a project you should look into.

But really, whatever you decide, document it for yourself! Your repository README should at least cover setting up a new machine, adding a new config file, and syncing changes on an existing machine.
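For instance, a minimal README sketch along these lines covers all three cases (the commands are illustrative and assume a dfm-style workflow; substitute whatever your own tooling uses):

```markdown
## Setting up a new machine

    git clone <repo-url> ~/dotfiles
    dfm link

## Adding a new config file

    dfm add ~/.tmux.conf
    git -C ~/dotfiles commit -am "Track .tmux.conf"

## Syncing changes on an existing machine

    git -C ~/dotfiles pull
    dfm link
```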

Footnotes

  1. To get an idea of what I was dealing with, here is the last version of that script before I deleted it.

Hi, I’m Ryan. I’ve been a full-stack developer for over 15 years and work on every part of the modern tech stack. Whenever I encounter something interesting in my work, I write about it here. Thanks for reading this post, I hope you enjoyed it!
© 2020 Ryan Patterson. Subscribe via RSS.