I wrote previously about how I back up database files automatically. The key word there being “automatically”. If I have to remember to make a backup, the odds of it happening drop to zero. So I automate as I described in that piece, but that’s not the only backup I have.
The point for me as a writer is that I don’t want to lose these words.
Part of the answer is backing up databases, but part of my solution is also creating workflows which automatically spawn backups.
This is actually my preferred backup method because it’s not just a backup, it’s future proofing. PostgreSQL may not be around ten years from now (I hope it is, because it’s pretty awesome, but it may not be), and if it isn’t, the database is not my only backup.
In fact I’ve got at least half a dozen backups of these words and I haven’t even finished this piece yet. Right now I’m typing these words in Vim and will save the file in a Git repo that will get pushed to a server. That’s two backups. Later the containing folder will be backed up on S3 (weekly), as well as two local drives (one daily, one weekly, both rsync copies).
None of that really requires any effort on my part. I do have to add this file to the Git repo and then commit and push it to the remote server, but Vim’s Fugitive plugin makes that ridiculously simple.
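For the curious, the scheduled copies look something like this crontab fragment. The paths, times, and bucket name here are placeholders for illustration, not my actual setup:

```shell
# Sketch of the backup schedule (placeholder paths and bucket name).
# Daily rsync mirror to the first local drive:
0 2 * * *   rsync -a --delete /home/me/writing/ /mnt/backup-daily/writing/
# Weekly mirror to the second drive, plus the weekly S3 copy:
0 3 * * 0   rsync -a --delete /home/me/writing/ /mnt/backup-weekly/writing/
0 4 * * 0   aws s3 sync /home/me/writing/ s3://example-backup-bucket/writing/
```

The `-a` flag preserves permissions and timestamps, and `--delete` keeps the mirrors exact copies rather than ever-growing piles.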
That’s not the end of the backups though. Once I’m done writing I’ll cut and paste this piece into my Django app and hit a publish button that will write the results out to the flat HTML file you’re actually reading right now (this file is another backup). I also output a plain text version (just append .txt to any luxagraf URL to see a plain text version of the page).
The end result is that all of this makes it very unlikely I will lose these words outright.
However, when I plugged these words into the database I gave this article a relationship with other objects in that database. So even though the redundant backups built into my workflow make a total data loss unlikely, without the database I will lose the relationships I’ve created. That’s why I have a solid PostgreSQL backup strategy. But what if Postgres does disappear?
I could, and occasionally do, output all the data in the database to flat files with JSON or YAML versions of the metadata attached. Or at least some of it. It’s hard to output massive amounts of geodata to a text file (for example, the shapefiles of national parks aren’t particularly useful as text data).
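The flat-file output amounts to something like this: a JSON metadata header, a blank line, then the words themselves. This is a minimal sketch, not my actual export script; the title and slug values here are stand-ins where a real run would pull them out of the database with psql or a Django query:

```shell
# Hypothetical flat-file export: JSON metadata header, blank line, then the text.
# The title and slug are stand-in values, not pulled from a real database.
title="Sample Entry"
slug="sample-entry"
out=$(mktemp)
{
  printf '{"title": "%s", "slug": "%s"}\n' "$title" "$slug"
  printf '\n%s\n' "The words themselves survive even if the database does not."
} > "$out"
cat "$out"
```

Even if every database engine I use vanishes, a file like that remains readable with nothing more than a text editor.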
I’m not sure what the answer is really, but lately I’ve been thinking that maybe the answer is just to let it go? The words are the story, that’s what my family, my kids, my friends, and whatever few readers I have really want. I’m the only one that cares about the larger story that includes the metadata, the relationships between the stories. Maybe I don’t need that. Maybe that it’s here today at all is remarkable enough on its own.
The web is after all an ephemeral thing. It depends on our continued ability to do so many things we won’t be able to do forever, like burn fossil fuels. In the end the most lasting backup I have may well be the 8.5x11 sheets of paper I’ve recently taken to printing out. Everything else depends on so much.