Compare commits

..

72 Commits

Author SHA1 Message Date
a1cf833079 initial-setup: include systemd unit in vars 2025-03-15 12:52:39 -04:00
2024519b16 initial-setup: fix script permissions 2025-03-15 11:12:53 -04:00
90e9310ae0 Update README, add logs.sh 2025-03-15 11:08:52 -04:00
a9ab963d49 initial-setup: support different systemd unit names 2025-03-14 23:34:32 -04:00
eec423bc6f initial-setup: bugfixes 2025-03-14 15:00:29 -04:00
8caf8e04fb initial-setup: clean up, add PORT option, improve --update 2025-03-14 14:16:18 -04:00
b16c4d7330 Support recovery by setting up an existing repo 2024-07-01 14:24:28 -04:00
59ceff6d1a Update Pipfile.lock for newer Cython compatibility 2023-11-03 13:52:32 -04:00
9703c5fc72 bin: rebuild borg.x86_64 with staticx 0.13.8 2022-09-17 11:55:44 -04:00
c68b867b50 Add notes about host ID 2021-11-16 10:11:08 -05:00
7dea155f58 use python3 when getting UUID 2021-11-16 09:55:58 -05:00
f6e8863128 backup: adjust email formatting 2021-10-26 21:37:53 -04:00
342e2cd0e8 Update README 2021-10-26 16:03:52 -04:00
f14b0d2d4d backup.py: fix notification error 2021-10-26 16:00:46 -04:00
b74f9c75a2 initial-setup: fix --update 2021-10-26 15:55:32 -04:00
9b38c248d8 borg: add ARM binary for Pi; update scripts to use it 2021-10-26 15:50:08 -04:00
46f9f98860 backup: show errors at top of email notification 2021-10-26 13:24:39 -04:00
dc7d72b2da initial-setup: make systemd units restart on failure 2021-10-26 13:24:28 -04:00
cb12e09c46 backup: rework output to make notification emails easier to read 2021-10-26 13:20:21 -04:00
1115d1f821 backup: rename pstr() helper to b2s()
The helper is just bytes->str conversion with errors=backslashreplace,
which we can use for more than just paths.
2021-10-26 12:54:41 -04:00
e2f92ccb7a readme: fix typo 2021-10-19 14:48:28 -04:00
a15cb5b07d all: remove concept of read-write key
We don't need a read-write key: we can just SSH directly to
jim-backups@backup.jim.sh instead and run commands that way.
Remove read-write key and document it in the README.

Also add some tools to update the README variables on updates.
2021-10-19 14:46:34 -04:00
51c5b5e9ca backup: fix prune archive name 2021-10-19 12:28:20 -04:00
ed8ea15aa7 backup: only prune archives that match default naming pattern 2021-10-19 12:18:46 -04:00
481e01896b backup: fix issue with ignoring "changed while we backed it up" warnings 2021-10-19 12:14:42 -04:00
e85e08cace backup: call prune after backup; add run_borg helper
Automatically prunes after backup, although this doesn't actually
free up space (because we're in append-only mode).
2021-10-19 11:16:17 -04:00
4b7802ad5f backup: flush stderr after all writes 2021-10-18 19:35:39 -04:00
4a30b82e39 backup: replace simple max size with rule-based system
Now individual files or patterns can have their own maximum sizes.
2021-10-18 17:43:33 -04:00
ac12b42cad backup: rename force-include to unexclude
Force-include is a misnomer because it won't include files
that weren't considered at all (like files in an excluded subdir).
Instead, call it "unexclude" to make it slightly clearer that this
will just override the exclusions.
2021-10-18 16:25:23 -04:00
97b9060344 make: add targets to help view log status 2021-10-18 15:13:10 -04:00
16fe205715 backup: remove pathnames from progress output
It clutters up the output and isn't super useful
2021-10-18 15:03:32 -04:00
1fb8645b27 build: make Borg.bin a static binary
This prevents it from e.g. needing a specific glibc version on the
client.
2021-10-17 21:31:16 -04:00
b1748455a0 setup: add pipenv check 2021-10-17 21:16:52 -04:00
d413ea3b82 setup: prevent pager in systemctl list-timers 2021-10-17 21:03:15 -04:00
1932a76f72 prune: remove -v option to support old ssh-add 2021-10-17 20:05:47 -04:00
81d430b56b backup: print exceptions from reader thread 2021-10-17 20:02:46 -04:00
2d89e530be backup: split handling of log_message and progress_message 2021-10-17 20:01:55 -04:00
3024cf2e69 backup: stop main thread if reader thread dies unexpectedly
_thread.interrupt_main will trigger KeyboardInterrupt in the main thread.
2021-10-17 20:00:55 -04:00
a540f4336f backup: fix nonlocal variable issue with errors 2021-10-17 19:31:56 -04:00
643f41d7f7 backup: tweak types for python 3.7 compatibility 2021-10-17 19:31:36 -04:00
1beda9d613 setup: pick host-dependent start time 2021-10-17 09:26:41 -04:00
8d7282eac1 borg.sh: fix ssh option for read-write mode 2021-10-17 08:56:53 -04:00
2b81094a32 backup: fix borg exit code handling for ret=0 2021-10-17 01:05:03 -04:00
e7b0320c9f backup: fix ignoring of harmless borg warnings 2021-10-17 00:55:49 -04:00
a18b9ed6d0 backup: track errors/warnings from borg; add prefix to them
This also ignores the "file changed while we backed it up" error, because
that isn't important enough to warrant sending an email.
2021-10-17 00:16:43 -04:00
756dbe1898 backup: fix mypy-detected errors 2021-10-17 00:14:14 -04:00
ed1d79d400 makefile: reload systemd unit files after rebase 2021-10-16 23:47:34 -04:00
2caceedea7 backup: show detailed progress from borg 2021-10-16 23:40:36 -04:00
42edd0225d setup: fix bitwarden entry name 2021-10-16 19:21:09 -04:00
ad13bb343a make: add helper to rebase local branches to incorporate upstream changes 2021-10-16 18:52:34 -04:00
f2b47dcba2 backup: parse vars.sh and use hostname from that 2021-10-16 18:52:34 -04:00
d1d561cb70 setup: allow hostname to be overridden 2021-10-16 18:48:43 -04:00
6066188ef1 vars: remove duplicate host_id 2021-10-16 09:45:50 -04:00
f70bffed37 misc: ignore .venv dir 2021-10-16 09:37:04 -04:00
979dfd892f backup: revert to catching fewer exceptions
We specifically don't want to catch BrokenPipeError; just list
file-related ones that we might expect to see if we hit bad
permissions, disk errors, or race conditions.
2021-10-16 09:37:04 -04:00
ab6dce0c2c borg: update binary to fix upstream bug 6009 2021-10-16 09:37:04 -04:00
aff447c1b6 notify: fix notify.sh to work with server side; adjust text 2021-10-16 09:37:03 -04:00
f7e9c3e232 borg.sh: only try ssh keys, not password authentication 2021-10-16 09:37:03 -04:00
d168c5bf54 backup: catch all OSError exceptions while accessing files
We might see these if files change during the scan, for example.
2021-10-16 09:37:03 -04:00
31d88f9345 backup: print final results and run notification script on error 2021-10-16 09:37:03 -04:00
ccf54b98d7 backup: fix archive name
Was overly quoted from when this was a shell script
2021-10-16 09:37:03 -04:00
59ad2b5b4d backup: capture borg output for later reporting 2021-10-16 09:37:03 -04:00
0c74f1676c backup: add bold option to log(); simplify logic 2021-10-16 09:37:03 -04:00
5e06ebd822 backup: change some warnings into errors 2021-10-16 09:37:03 -04:00
929a323cf0 notify: add ssh key for running remote notifications; add notify.sh 2021-10-16 09:37:03 -04:00
86bb72f201 setup: fix borg path in initial connection test 2021-10-16 09:37:03 -04:00
54437456ae prune: use new vars.sh 2021-10-16 09:37:03 -04:00
c7a6d08665 initial-setup: generate vars.sh instead of borg.sh; commit borg.sh
Put setup-time variables into a generated vars.sh, and put borg.sh
directly into the repo.
2021-10-16 09:36:56 -04:00
3c5dcd2189 config: remove /efi, it probably doesn't exist 2021-10-15 23:21:28 -04:00
1a44035ae8 makefile: fix test-backup target 2021-10-15 23:21:14 -04:00
6830daa2b1 prune: save password in an SSH agent, and compact after pruning
Since we want to run two commands, use a temporary SSH agent
to hold the key, so that the user only has to enter the password
once.
2021-10-14 15:34:10 -04:00
69bfecd657 borg: include borg binary in repository
Put our own binary in here, so we can keep it updated with local
patches more easily.  Also add build instructions.
This one is built from
https://github.com/borgbackup/borg/pull/6011
2021-10-14 13:26:24 -04:00
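
One of the setup changes above, 1beda9d613 "setup: pick host-dependent start time", derives a per-host backup start time from a hash of the hostname; the shell arithmetic appears later in the initial-setup.sh diff. As a rough illustration only (a Python re-expression of that scheme, not code from the repo):

    # Sketch: stable per-host backup start time between 01:00 and 05:50,
    # mirroring the sha1sum-based arithmetic in initial-setup.sh below.
    import hashlib

    def backup_time(hostname: str) -> str:
        digest = hashlib.sha1(f"hash of {hostname}\n".encode()).hexdigest()
        hour = int(digest[0:8], 16) % 5 + 1       # 1..5
        minute = int(digest[8:16], 16) % 6 * 10   # 0, 10, ..., 50
        return f"{hour:02d}:{minute:02d}:00"

    print(backup_time("examplehost"))   # e.g. "04:20:00"

The systemd timer then adds a RandomizedDelaySec on top of this base time.
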
15 changed files with 914 additions and 480 deletions

.gitea/README.md (new symbolic link, 1 line)

@@ -0,0 +1 @@
../templates/README.md

.gitignore (vendored, 4 lines changed)

@@ -1,7 +1,9 @@
+.venv
 *.html
-Borg.bin
 cache/
 config/
 key.txt
 passphrase
 ssh/
+/README.md
+/notify.sh

Makefile

@@ -19,7 +19,7 @@ ctrl: test-backup
 .PHONY: test-backup
 test-backup: .venv
         .venv/bin/mypy backup.py
-        ./backup.py | tr '\0' '\n' #-n
+        ./backup.py -n
 
 .PHONY: test-setup
 test-setup:
@@ -31,6 +31,27 @@ test-setup:
         #git ls-files -z | tar --null -T - -cf - | tar -C /tmp/test-borg -xvf -
         /tmp/test-borg/initial-setup.sh
 
+# Pull master and rebase "setup-$HOSTNAME" branch onto it
+.PHONY: rebase
+rebase:
+        git checkout master
+        git pull
+        git checkout -
+        git rebase master
+        ./initial-setup.sh --update
+        systemctl daemon-reload
+        git status
+
+# Show status of most recent backup run
+.PHONY: status
+status:
+        systemctl status --full --lines 999999 --no-pager --all borg-backup || true
+
+# Watch live log output
+.PHONY: tail
+tail:
+        journalctl --all --follow --lines 200 --unit borg-backup
+
 .PHONY: clean
 clean:
         rm -f README.html

Pipfile.lock (generated, 174 lines changed)

Generated lockfile refresh: each package's sha256 "hashes" list was replaced wholesale (hash lists omitted here); the substantive changes are the pinned versions and environment markers:

  default:
    bracex              ==2.1.1     -> ==2.4          (adds marker python_version >= '3.8')
    humanfriendly       unchanged
    pyyaml              ==5.4.1     -> ==6.0.1
    wcmatch             ==8.2       -> ==8.5
  develop:
    mypy                ==0.910     -> ==1.6.1
    mypy-extensions     ==0.4.3     -> ==1.0.0        (adds marker python_version >= '3.5')
    toml                ==0.10.2    -> (removed)
    types-pyyaml        ==5.4.10    -> ==6.0.12.12
    typing-extensions   ==3.10.0.2  -> ==4.8.0        (adds marker python_version >= '3.8')

README.md (107 lines removed)

@@ -1,107 +0,0 @@
Initial setup
=============
Run on client:
sudo git clone https://git.jim.sh/jim/borg-setup.git /opt/borg
sudo /opt/borg/initial-setup.sh
Customize `/opt/borg/config.yaml` as desired.
Cheat sheet
===========
*After setup, the copy of this file on the client will have the
variables in this section filled in automatically*
### Configuration
Hostname: ${HOSTNAME}
Base directory: ${BORG_DIR}
Destination: ${BACKUP_USER}@${BACKUP_HOST}
Repository: ${BACKUP_REPO}
### Commands
See when next backup is scheduled:
systemctl list-timers borg-backup.timer
See status of most recent backup:
systemctl status --full --lines 999999 --no-pager --all borg-backup
Watch log:
journalctl --all --follow --unit borg-backup
Start backup now:
sudo systemctl start borg-backup
Interrupt backup in progress:
sudo systemctl stop borg-backup
Show backups and related info:
sudo ${BORG_DIR}/borg.sh info
sudo ${BORG_DIR}/borg.sh list
Run Borg using the read-write SSH key:
sudo ${BORG_DIR}/borg.sh --rw list
Mount and look at files:
mkdir mnt
sudo ${BORG_DIR}/borg.sh mount :: mnt
sudo -s # to explore as root
sudo umount mnt
Prune old backups. Only run if sure local system was never compromised,
as object deletion could have been queued during append-only operations.
Requires SSH key password from bitwarden.
sudo ${BORG_DIR}/prune.sh
Design
======
- On server, we have a separate user account "jim-backups". Password
for this account is in bitwarden in the "Backups" folder, under `ssh
backup.jim.sh`.
- Repository keys are repokeys, which get stored on the server, inside
the repo. Passphrases are stored:
- on clients (in `/opt/borg/passphrase`, for making backups)
- in bitwarden (under `borg <hostname>`, user `repo key`)
- Each client has two SSH keys for connecting to the server:
- `/opt/borg/ssh/id_ecdsa_appendonly`
- configured on server for append-only operation
- used for making backups
- no password
- `/opt/borg/ssh/id_ecdsa`
- configured on server for read-write operation
- used for manual recovery, management, pruning
- password in bitwarden (under `borg [hostname]`, user `read-write ssh key`)
- Pruning requires the password and is a manual operation, and should only
be run when the client has not been compromised.
sudo /opt/borg/prune.sh
- Systemd timers start daily backups:
/etc/systemd/system/borg-backup.service -> /opt/borg/borg-backup.service
/etc/systemd/system/borg-backup.timer -> /opt/borg/borg-backup.timer
- Backup script `/opt/borg/backup.py` uses configuration in
`/opt/borg/config.yaml` to generate our own list of files, excluding
anything that's too large by default. This requires borg 1.2.0b1
or newer, which is why the setup scripts download a specific version.

backup.py (429 lines changed)

@@ -8,10 +8,14 @@
import os import os
import re import re
import sys import sys
import json
import stat import stat
import time import time
import select
import pathlib import pathlib
import threading
import subprocess import subprocess
import _thread # for interrupt_main
import typing import typing
@@ -19,41 +23,31 @@ import yaml
import wcmatch.glob # type: ignore import wcmatch.glob # type: ignore
import humanfriendly # type: ignore import humanfriendly # type: ignore
def pstr(path: bytes) -> str: def b2s(raw: bytes) -> str:
return path.decode(errors='backslashreplace') return raw.decode(errors='backslashreplace')
def format_size(n: int) -> str:
return humanfriendly.format_size(n, keep_width=True, binary=True)
# Type corresponding to patterns that are generated by
# wcmatch.translate: two lists of compiled REs (a,b). A path matches
# if it matches at least one regex in "a" and none in "b".
MatchPatterns = typing.Tuple[typing.List[re.Pattern], typing.List[re.Pattern]]
class Config: class Config:
roots: list[bytes] roots: typing.List[bytes]
max_file_size: typing.Optional[int]
one_file_system: bool one_file_system: bool
exclude_caches: bool exclude_caches: bool
exclude: list[bytes] exclude: MatchPatterns
force_include: list[bytes] unexclude: MatchPatterns
max_size_rules: typing.List[typing.Tuple[int, MatchPatterns]]
notify_email: typing.Optional[str] notify_email: typing.Optional[str]
def __init__(self, configfile: str): def __init__(self, configfile: str):
# Read config
with open(configfile, 'r') as f:
config = yaml.safe_load(f)
self.one_file_system = config.get('one-file-system', False)
self.exclude_caches = config.get('exclude-caches', False)
if 'max-file-size' in config: # Helper to process lists of patterns into regexes
self.max_file_size = humanfriendly.parse_size( def process_match_list(config_entry):
config['max-file-size']) raw = config_entry.encode().split(b'\n')
else:
self.max_file_size = None
raw = config.get('roots', '').encode().split(b'\n')
self.roots = []
for x in raw:
if not len(x):
continue
self.roots.append(x)
self.roots.sort(key=len)
def process_match_list(config_name):
raw = config.get(config_name, '').encode().split(b'\n')
pats = [] pats = []
# Prepend '**/' to any relative patterns # Prepend '**/' to any relative patterns
for x in raw: for x in raw:
@@ -63,37 +57,49 @@ class Config:
pats.append(x) pats.append(x)
else: else:
pats.append(b'**/' + x) pats.append(b'**/' + x)
return pats
self.exclude = process_match_list('exclude') # Compile patterns.
self.force_include = process_match_list('force-include') (a, b) = wcmatch.glob.translate(
pats, flags=(wcmatch.glob.GLOBSTAR |
self.notify_email = config.get('notify-email', None)
# Compile patterns
flags = (wcmatch.glob.GLOBSTAR |
wcmatch.glob.DOTGLOB | wcmatch.glob.DOTGLOB |
wcmatch.glob.NODOTDIR | wcmatch.glob.NODOTDIR |
wcmatch.glob.EXTGLOB | wcmatch.glob.EXTGLOB |
wcmatch.glob.BRACE) wcmatch.glob.BRACE))
return ([ re.compile(x) for x in a ],
# Path matches if it matches at least one regex in "a" and no
# regex in "b"
(a, b) = wcmatch.glob.translate(self.exclude, flags=flags)
self.exclude_re = ([ re.compile(x) for x in a ],
[ re.compile(x) for x in b ]) [ re.compile(x) for x in b ])
(a, b) = wcmatch.glob.translate(self.force_include, flags=flags) # Read config
self.force_include_re = ([ re.compile(x) for x in a ], with open(configfile, 'r') as f:
[ re.compile(x) for x in b ]) config = yaml.safe_load(f)
self.one_file_system = config.get('one-file-system', False)
self.exclude_caches = config.get('exclude-caches', False)
def match_re(self, re: tuple[list[typing.Pattern], raw = config.get('roots', '').encode().split(b'\n')
list[typing.Pattern]], path: bytes): self.roots = []
for x in raw:
if not len(x):
continue
self.roots.append(x)
self.roots.sort(key=len)
self.exclude = process_match_list(config.get('exclude', ''))
self.unexclude = process_match_list(config.get('unexclude', ''))
self.max_size_rules = []
rules = { humanfriendly.parse_size(k): v
for k, v in config.get('max-size-rules', {}).items() }
for size in reversed(sorted(rules)):
self.max_size_rules.append(
(size, process_match_list(rules[size])))
self.notify_email = config.get('notify-email', None)
def match_re(self, r: MatchPatterns, path: bytes):
# Path matches if it matches at least one regex in # Path matches if it matches at least one regex in
# re[0] and no regex in re[1]. # r[0] and no regex in r[1].
for a in re[0]: for a in r[0]:
if a.match(path): if a.match(path):
for b in re[1]: for b in r[1]:
if b.match(path): if b.match(path):
return False return False
return True return True
@@ -103,28 +109,33 @@ class Backup:
def __init__(self, config: Config, dry_run: bool): def __init__(self, config: Config, dry_run: bool):
self.config = config self.config = config
self.dry_run = dry_run self.dry_run = dry_run
self.root_seen: dict[bytes, bool] = {} self.root_seen: typing.Dict[bytes, bool] = {}
# All logged messages, with severity # Saved log messages (which includes borg output)
self.logs: list[tuple[str, str]] = [] self.logs: typing.List[typing.Tuple[str, str]] = []
def out(self, path: bytes): def out(self, path: bytes):
self.outfile.write(path + (b'\n' if self.dry_run else b'\0')) self.outfile.write(path + (b'\n' if self.dry_run else b'\0'))
def log(self, letter: str, msg: str): def log(self, letter: str, msg: str, bold: bool=False):
colors = { 'E': 31, 'W': 33, 'I': 36 }; colors = {
if letter in colors: 'E': 31, # red: error
c = colors[letter] 'W': 33, # yellow: warning
else: 'N': 34, # blue: notice, a weaker warning (no email generated)
c = 0 'I': 36, # cyan: info, backup.py script output
sys.stderr.write(f"\033[1;{c}m{letter}:\033[22m {msg}\033[0m\n") 'O': 37, # white: regular output from borg
};
c = colors[letter] if letter in colors else 0
b = "" if bold else "\033[22m"
sys.stdout.write(f"\033[1;{c}m{letter}:{b} {msg}\033[0m\n")
sys.stdout.flush()
self.logs.append((letter, msg)) self.logs.append((letter, msg))
def run(self, outfile: typing.IO[bytes]): def run(self, outfile: typing.IO[bytes]):
self.outfile = outfile self.outfile = outfile
for root in self.config.roots: for root in self.config.roots:
if root in self.root_seen: if root in self.root_seen:
self.log('I', f"ignoring root, already seen: {pstr(root)}") self.log('I', f"ignoring root, already seen: {b2s(root)}")
continue continue
try: try:
@@ -132,13 +143,13 @@ class Backup:
if not stat.S_ISDIR(st.st_mode): if not stat.S_ISDIR(st.st_mode):
raise NotADirectoryError raise NotADirectoryError
except FileNotFoundError: except FileNotFoundError:
self.log('W', f"ignoring root, does not exist: {pstr(root)}") self.log('E', f"root does not exist: {b2s(root)}")
continue continue
except NotADirectoryError: except NotADirectoryError:
self.log('W', f"ignoring root, not a directory: {pstr(root)}") self.log('E', f"root is not a directory: {b2s(root)}")
continue continue
self.log('I', f"processing root {pstr(root)}") self.log('I', f"processing root {b2s(root)}")
self.scan(root) self.scan(root)
def scan(self, path: bytes, parent_st: os.stat_result=None): def scan(self, path: bytes, parent_st: os.stat_result=None):
@@ -159,7 +170,7 @@ class Backup:
# See if there's a reason to exclude it # See if there's a reason to exclude it
exclude_reason = None exclude_reason = None
if self.config.match_re(self.config.exclude_re, decorated_path): if self.config.match_re(self.config.exclude, decorated_path):
# Config file says to exclude # Config file says to exclude
exclude_reason = ('I', f"skipping, excluded by config file") exclude_reason = ('I', f"skipping, excluded by config file")
@@ -171,23 +182,27 @@ class Backup:
exclude_reason = ('I', "skipping, on different filesystem") exclude_reason = ('I', "skipping, on different filesystem")
elif (is_reg elif (is_reg
and self.config.max_file_size and len(self.config.max_size_rules)
and size > self.config.max_file_size): and size > self.config.max_size_rules[-1][0]):
# Too big # Check file sizes against our list.
def format_size(n): # Only need to check if the size is bigger than the smallest
return humanfriendly.format_size( # entry on the list; then, we need to check it against all rules
n, keep_width=True, binary=True) # to see which one applies.
for (max_size, patterns) in self.config.max_size_rules:
if self.config.match_re(patterns, decorated_path):
if size > max_size:
a = format_size(size) a = format_size(size)
b = format_size(self.config.max_file_size) b = format_size(max_size)
exclude_reason = ('W', f"file size {a} exceeds limit {b}") exclude_reason = (
'W', f"file size {a} exceeds limit {b}")
break
# If we have a reason to exclude it, stop now unless it's # If we have a reason to exclude it, stop now unless it's
# force-included # force-included
force = self.config.match_re(self.config.force_include_re, force = self.config.match_re(self.config.unexclude, decorated_path)
decorated_path)
if exclude_reason and not force: if exclude_reason and not force:
self.log(exclude_reason[0], self.log(exclude_reason[0],
f"{exclude_reason[1]}: {pstr(path)}") f"{exclude_reason[1]}: {b2s(path)}")
return return
# Print path for Borg # Print path for Borg
@@ -209,7 +224,7 @@ class Backup:
with open(path + b'/CACHEDIR.TAG', 'rb') as f: with open(path + b'/CACHEDIR.TAG', 'rb') as f:
if f.read(len(tag)) == tag: if f.read(len(tag)) == tag:
self.log( self.log(
'I', f"skipping, cache dir: {pstr(path)}") 'I', f"skipping, cache dir: {b2s(path)}")
return return
except: except:
pass pass
@@ -219,16 +234,136 @@ class Backup:
for entry in it: for entry in it:
self.scan(path=entry.path, parent_st=st) self.scan(path=entry.path, parent_st=st)
except PermissionError as e: except (FileNotFoundError,
self.log('E', f"can't read {pstr(path)}") IsADirectoryError,
NotADirectoryError,
PermissionError) as e:
self.log('E', f"can't read {b2s(path)}: {str(e)}")
return return
def main(argv: list[str]): def run_borg(self, argv: typing.List[str],
stdin_writer: typing.Callable[[typing.IO[bytes]],
typing.Any]=None):
"""Run a borg command, capturing and displaying output, while feeding
input using stdin_writer. Returns True on Borg success, False on error.
"""
borg = subprocess.Popen(argv,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
if borg.stdin is None:
raise Exception("no pipe")
# Count warnings and errors from Borg, so we can interpret its
# error codes correctly (e.g. ignoring exit codes if warnings
# were all harmless).
borg_saw_warnings = 0
borg_saw_errors = 0
# Use a thread to capture output
def reader_thread(fh):
nonlocal borg_saw_warnings
nonlocal borg_saw_errors
last_progress = 0
for line in fh:
try:
data = json.loads(line)
if data['type'] == 'log_message':
changed_msg = "file changed while we backed it up"
if data['levelname'] == 'WARNING':
if changed_msg in data['message']:
# harmless; don't count as a Borg warning
outlevel = 'N'
else:
borg_saw_warnings += 1
outlevel = 'W'
output = "warning: "
elif data['levelname'] not in ('DEBUG', 'INFO'):
borg_saw_errors += 1
outlevel = 'E'
output = "error: "
else:
outlevel = 'O'
output = ""
output += data['message']
elif (data['type'] == 'progress_message'
and 'message' in data):
outlevel = 'O'
output = data['message']
elif data['type'] == 'archive_progress':
now = time.time()
if now - last_progress > 10:
last_progress = now
def size(short: str, full: str) -> str:
return f" {short}={format_size(data[full])}"
outlevel = 'O'
output = (f"progress:" +
f" files={data['nfiles']}" +
size('orig', 'original_size') +
size('comp', 'compressed_size') +
size('dedup', 'deduplicated_size'))
else:
continue
else:
# ignore unknown progress line
continue
except Exception as e:
# on error, print raw line with exception
outlevel = 'E'
output = f"[exception: {str(e)}] " + b2s(line).rstrip()
self.log(outlevel, output)
fh.close()
def _reader_thread(fh):
try:
return reader_thread(fh)
except BrokenPipeError:
pass
except Exception:
_thread.interrupt_main()
reader = threading.Thread(target=_reader_thread, args=(borg.stdout,))
reader.daemon = True
reader.start()
try:
if stdin_writer:
# Give borg some time to start, just to clean up stdout
time.sleep(1)
stdin_writer(borg.stdin)
except BrokenPipeError:
self.log('E', "<broken pipe>")
finally:
try:
borg.stdin.close()
except BrokenPipeError:
pass
borg.wait()
reader.join()
ret = borg.returncode
if ret < 0:
self.log('E', f"borg exited with signal {-ret}")
elif ret == 2 or borg_saw_errors:
self.log('E', f"borg exited with errors (ret={ret})")
elif ret == 1:
if borg_saw_warnings:
self.log('W', f"borg exited with warnings (ret={ret})")
else:
return True
elif ret != 0:
self.log('E', f"borg exited with unknown error code {ret}")
else:
return True
return False
def main(argv: typing.List[str]):
import argparse import argparse
def humansize(string): def humansize(string):
return humanfriendly.parse_size(string) return humanfriendly.parse_size(string)
# Parse args
parser = argparse.ArgumentParser( parser = argparse.ArgumentParser(
prog=argv[0], prog=argv[0],
description="Back up the local system using borg", description="Back up the local system using borg",
@@ -237,8 +372,8 @@ def main(argv: list[str]):
base = pathlib.Path(__file__).parent base = pathlib.Path(__file__).parent
parser.add_argument('-c', '--config', parser.add_argument('-c', '--config',
help="Config file", default=str(base / "config.yaml")) help="Config file", default=str(base / "config.yaml"))
parser.add_argument('-b', '--borg', parser.add_argument('-v', '--vars',
help="Borg command", default=str(base / "borg.sh")) help="Variables file", default=str(base / "vars.sh"))
parser.add_argument('-n', '--dry-run', action="store_true", parser.add_argument('-n', '--dry-run', action="store_true",
help="Just print log output, don't run borg") help="Just print log output, don't run borg")
parser.add_argument('-d', '--debug', action="store_true", parser.add_argument('-d', '--debug', action="store_true",
@@ -246,18 +381,43 @@ def main(argv: list[str]):
args = parser.parse_args() args = parser.parse_args()
config = Config(args.config) config = Config(args.config)
backup = Backup(config, args.dry_run) backup = Backup(config, args.dry_run)
# Parse variables from vars.sh
hostname = os.uname().nodename
borg_sh = str(base / "borg.sh")
notify_sh = str(base / "notify.sh")
try:
with open(args.vars) as f:
for line in f:
m = re.match(r"\s*export\s*([A-Z_]+)=(.*)", line)
if not m:
continue
var = m.group(1)
value = m.group(2)
if var == "HOSTNAME":
hostname = value
if var == "BORG":
borg_sh = value
if var == "BORG_DIR":
notify_sh = str(pathlib.Path(value) / "notify.sh")
except Exception as e:
backup.log('W', f"failed to parse variables from {args.vars}: {str(e)}")
# Run backup
if args.dry_run: if args.dry_run:
if args.debug: if args.debug:
backup.run(sys.stdout.buffer) backup.run(sys.stdout.buffer)
else: else:
with open(os.devnull, "wb") as out: with open(os.devnull, "wb") as out:
backup.run(out) backup.run(out)
sys.stdout.flush()
else: else:
borg = subprocess.Popen([args.borg, if backup.run_borg([borg_sh,
"create", "create",
"--verbose", "--verbose",
"--progress",
"--log-json",
"--list", "--list",
"--filter", "E", "--filter", "E",
"--stats", "--stats",
@@ -265,30 +425,87 @@ def main(argv: list[str]):
"--compression", "zstd,3", "--compression", "zstd,3",
"--paths-from-stdin", "--paths-from-stdin",
"--paths-delimiter", "\\0", "--paths-delimiter", "\\0",
"::'{hostname}-{now:%Y%m%d-%H%M%S}'"], "::" + hostname + "-{now:%Y%m%d-%H%M%S}"],
stdin=subprocess.PIPE) stdin_writer=backup.run):
if borg.stdin is None:
raise Exception("no pipe")
try:
# Give borg some time to start, just to clean up stdout
time.sleep(2)
backup.run(borg.stdin)
except BrokenPipeError:
sys.stderr.write(f"broken pipe\n")
finally:
try:
borg.stdin.close()
except BrokenPipeError:
pass
borg.wait()
ret = borg.returncode
if ret < 0:
sys.stderr.write(f"error: process exited with signal {-ret}\n")
return 1
elif ret != 0:
sys.stderr.write(f"error: process exited with return code {ret}\n")
return ret
# backup success; run prune. Note that this won't actually free
# space until a "./borg.sh --rw compact", because we're in
# append-only mode.
backup.log('I', f"pruning archives", bold=True)
backup.run_borg([borg_sh,
"prune",
"--verbose",
"--list",
"--progress",
"--log-json",
"--stats",
"--keep-within=7d",
"--keep-daily=14",
"--keep-weekly=8",
"--keep-monthly=-1",
"--glob-archives", hostname + "-????????-??????"])
# See if we had any errors
warnings = sum(1 for (letter, msg) in backup.logs if letter == 'W')
errors = sum(1 for (letter, msg) in backup.logs if letter == 'E')
def plural(num: int, word: str) -> str:
suffix = "" if num == 1 else "s"
return f"{num} {word}{suffix}"
warnmsg = plural(warnings, "warning") if warnings else None
errmsg = plural(errors, "error") if errors else None
if not warnings and not errors:
backup.log('I', f"backup successful", bold=True)
else:
if warnmsg:
backup.log('W', f"reported {warnmsg}", bold=True)
if errors:
backup.log('E', f"reported {errmsg}", bold=True)
# Send a notification of errors
email = backup.config.notify_email
if email and not args.dry_run:
backup.log('I', f"sending error notification to {email}")
def write_logs(title, only_include=None):
body = [ title ]
for (letter, msg) in backup.logs:
if only_include and letter not in only_include:
continue
# Use a ":" prefix for warnings/errors/notices so that
# the mail reader highlights them.
if letter in "EWN":
prefix = ":"
else:
prefix = " "
body.append(f"{prefix}{letter}: {msg}")
return "\n".join(body).encode()
body_text = write_logs("Logged errors and warnings:", "EWN")
body_text += b"\n\n"
body_text += write_logs("All log messages:")
# Subject summary
if errmsg and warnmsg:
summary = f"{errmsg}, {warnmsg}"
elif errors:
summary = errmsg or ""
else:
summary = warnmsg or ""
# Call notify.sh
res = subprocess.run([notify_sh, summary, email], input=body_text)
if res.returncode != 0:
backup.log('E', f"failed to send notification")
errors += 1
# Exit with an error code if we had any errors
if errors:
return 1
return 0 return 0
if __name__ == "__main__": if __name__ == "__main__":
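
A note on the run_borg() changes visible above: borg is run with --log-json, and a reader thread classifies each output line by its type and levelname fields, demoting the harmless "file changed while we backed it up" warning to a notice so it never triggers an email. A simplified, self-contained sketch of that classification (not the repo's exact code; the field names follow the diff above):

    # Simplified sketch of run_borg()'s log-line handling: map one line of
    # borg --log-json output to (level, text), where E=error, W=warning,
    # N=notice (harmless, no email), O=ordinary output.
    import json

    HARMLESS = "file changed while we backed it up"

    def classify(line: bytes):
        try:
            data = json.loads(line)
        except ValueError:
            # unparseable line: report it raw, as an error
            return ('E', line.decode(errors='backslashreplace').rstrip())
        if data.get('type') != 'log_message':
            return ('O', data.get('message', ''))
        msg = data.get('message', '')
        if data.get('levelname') == 'WARNING':
            # the known-harmless warning becomes a notice and is not counted
            return ('N' if HARMLESS in msg else 'W', "warning: " + msg)
        if data.get('levelname') not in ('DEBUG', 'INFO'):
            return ('E', "error: " + msg)
        return ('O', msg)

    if __name__ == "__main__":
        sample = (b'{"type": "log_message", "levelname": "WARNING", '
                  b'"message": "demo: file changed while we backed it up"}')
        print(classify(sample))   # ('N', 'warning: demo: file changed ...')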

bin/borg.armv7l (executable; binary file, diff not shown)

bin/borg.x86_64 (executable; binary file, diff not shown)

borg.sh (new executable file, 12 lines)

@@ -0,0 +1,12 @@
#!/bin/bash
set -e
. "$(dirname "$0")"/vars.sh
export BORG_PASSCOMMAND="cat ${BORG_DIR}/passphrase"
export BORG_BASE_DIR=${BORG_DIR}
export BORG_CACHE_DIR=${BORG_DIR}/cache
export BORG_CONFIG_DIR=${BORG_DIR}/config
export BORG_RSH="ssh -F $SSH/config -i $SSH/id_ecdsa_appendonly"
exec "${BORG_BIN}" "$@"

config.yaml

@@ -4,19 +4,12 @@
 roots: |
   /
   /boot
-  /efi
   /usr
   /var
 
 one-file-system: true
 exclude-caches: true
 
-# Files larger than this are excluded. If a large file isn't
-# explicitly mentioned in "excludes" below, it also generates a
-# warning. Note that this counts used blocks, so files with large
-# holes will still be considered small (since they'll compress easily)
-max-file-size: 500MiB
-
 # Files/dirs to exclude from backup.
 # Relative paths are treated as if starting with **/
 # Paths ending in / will only match directories.
@@ -28,10 +21,21 @@ exclude: |
   Steam/ubuntu*/
   .cache/
 
+# Rules to exclude files based on file size.
+# This is a dict of sizes, each with a list of rules.
+# For a given path, the largest size with a matching rule applies.
+# Matching follows the same behavior as the "exclude" list.
+# Size is calculated as used blocks (think "du", not "du --apparent-size").
+max-size-rules:
+  500 MiB: |
+    *
+#  1.0 GiB: |
+#    *.mp4
+
 # Files that are always included, even if they would have been
 # excluded due to file size or the "exclude" list.
-# Matching rules are the same as above.
-force-include: |
+# Matching follows the same behavior as the "exclude" list.
+unexclude: |
   .git/objects/pack/*.pack
 
 # Email address for notification at end of backup
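
The max-size-rules block above says the largest size with a matching rule applies; in backup.py this is implemented by sorting the rules largest-first and stopping at the first match. A rough stand-alone sketch of that selection (fnmatch stands in for the wcmatch-translated regexes the real code compiles):

    # "Largest matching size wins": check rules from largest to smallest and
    # return the first limit whose patterns match the path.
    import fnmatch

    def size_limit_for(path: str, rules: dict):
        # rules: {max_size_in_bytes: [glob_pattern, ...]}
        for max_size in sorted(rules, reverse=True):
            if any(fnmatch.fnmatch(path, pat) for pat in rules[max_size]):
                return max_size
        return None   # no rule matched: no size limit applies

    rules = {500 * 2**20: ['*'],       # 500 MiB catch-all, as in the config above
             1 * 2**30: ['*.mp4']}     # the commented-out 1 GiB example for videos
    print(size_limit_for('video/a.mp4', rules))    # 1073741824 (1 GiB)
    print(size_limit_for('home/big.bin', rules))   # 524288000 (500 MiB)

Files that a matching limit (or the exclude list) would drop can still be pulled back in by the unexclude patterns, as with the .pack example above.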

View File

@@ -1,24 +1,40 @@
#!/bin/bash #!/bin/bash
# These can be overridden when running this script # These can be overridden when running this script
set_default_variables()
{
HOSTNAME=${HOSTNAME:-$(hostname)}
BACKUP_HOST=${BACKUP_HOST:-backup.jim.sh} BACKUP_HOST=${BACKUP_HOST:-backup.jim.sh}
BACKUP_PORT=${BACKUP_PORT:-222}
BACKUP_USER=${BACKUP_USER:-jim-backups} BACKUP_USER=${BACKUP_USER:-jim-backups}
BACKUP_REPO=${BACKUP_REPO:-borg/$(hostname)} BACKUP_REPO=${BACKUP_REPO:-borg/${HOSTNAME}}
SYSTEMD_UNIT=${SYSTEMD_UNIT:-borg-backup}
# Borg binary and hash # Use stable host ID in case MAC address changes.
BORG_URL="https://github.com/borgbackup/borg/releases/download/1.2.0b3/borg-linux64" # Note that this host ID is only used to manage locks, so it's
BORG_SHA256=8dd6c2769d9bf3ca7a65ebf6781302029fc3b15105aff63d33195c007f897360 # not crucial that it remains stable.
UUID=$(python3 -c 'import uuid;print(uuid.getnode())')
HOSTID=${BORG_HOST_ID:-"${HOSTNAME}@${UUID}"}
log "Configuration:"
log " HOSTNAME Local hostname: \033[1m${HOSTNAME}"
log " HOSTID Local host ID: \033[1m${HOSTID}"
log " BACKUP_USER Backup server user: \033[1m${BACKUP_USER}"
log " BACKUP_HOST Backup server host: \033[1m${BACKUP_HOST}"
log " BACKUP_PORT Backup server port: \033[1m${BACKUP_PORT}"
log " BACKUP_REPO Server repository: \033[1m${BACKUP_REPO}"
log " SYSTEMD_UNIT Systemd unit name: \033[1m${SYSTEMD_UNIT}"
for i in $(seq 15 -1 1); do
printf "\rPress ENTER or wait $i seconds to continue... \b"
if read -t 1 ; then break; fi
done
}
# Main dir is where this repo was checked out # Main dir is where this repo was checked out
BORG_DIR="$(realpath "$(dirname "$0")")" BORG_DIR="$(realpath "$(dirname "$0")")"
cd "${BORG_DIR}" cd "${BORG_DIR}"
# This is named with uppercase so that it doesn't tab-complete for BORG_BIN="${BORG_DIR}/bin/borg.$(uname -m)"
# "./b<tab>", which should give us "./borg.sh"
BORG_BIN="${BORG_DIR}/Borg.bin"
# Use stable host ID in case MAC address changes
HOSTID="$(hostname -f)@$(python -c 'import uuid;print(uuid.getnode())')"
function error_handler() { function error_handler() {
echo "Error at $1 line $2:" echo "Error at $1 line $2:"
@@ -30,11 +46,100 @@ trap 'error_handler ${BASH_SOURCE} ${LINENO} $?' ERR
set -o errexit set -o errexit
set -o errtrace set -o errtrace
if [ -e ".setup-complete" ]; then # Create pip environment
echo "Error: BORG_DIR $BORG_DIR was already set up; giving up." setup_venv()
echo "Use \"git clean\" to return it to original state if desired" {
exit 1 if ! which pipenv >/dev/null 2>&1 ; then
error "pipenv not found, try: sudo apt install pipenv"
fi fi
mkdir -p .venv
pipenv install
}
# Create shell script with environment variables
create_borg_vars()
{
VARS=${BORG_DIR}/vars.sh
# These variables are used elsewhere in this script
BORG_REPO="ssh://${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}/./${BACKUP_REPO}"
BORG=${BORG_DIR}/borg.sh
SSH=$BORG_DIR/ssh
cat >"$VARS" <<EOF
export BACKUP_USER=${BACKUP_USER}
export BACKUP_HOST=${BACKUP_HOST}
export BACKUP_PORT=${BACKUP_PORT}
export BACKUP_REPO=${BACKUP_REPO}
export HOSTNAME=${HOSTNAME}
export BORG_REPO=${BORG_REPO}
export BORG_HOST_ID=${HOSTID}
export BORG_PASSCOMMAND="cat ${BORG_DIR}/passphrase"
export BORG_DIR=${BORG_DIR}
export SSH=${SSH}
export BORG=${BORG}
export BORG_BIN=${BORG_BIN}
export SYSTEMD_UNIT=${SYSTEMD_UNIT}
EOF
if ! "$BORG" -h >/dev/null ; then
error "Can't run the borg wrapper; does borg work?"
fi
}
# Copy templated files, filling in templates as needed
install_templated_files()
{
DOCS="README.md"
SCRIPTS="notify.sh logs.sh"
for i in ${DOCS} ${SCRIPTS}; do
sed -e "s!\${HOSTNAME}!${HOSTNAME}!g" \
-e "s!\${BORG_DIR}!${BORG_DIR}!g" \
-e "s!\${BORG_BIN}!${BORG_BIN}!g" \
-e "s!\${BACKUP_USER}!${BACKUP_USER}!g" \
-e "s!\${BACKUP_HOST}!${BACKUP_HOST}!g" \
-e "s!\${BACKUP_PORT}!${BACKUP_PORT}!g" \
-e "s!\${BACKUP_REPO}!${BACKUP_REPO}!g" \
-e "s!\${SYSTEMD_UNIT}!${SYSTEMD_UNIT}!g" \
templates/$i > $i
done
chmod +x ${SCRIPTS}
}
# Update local paths in scripts
update_paths()
{
sed -i\
-e "1c#!${BORG_DIR}/.venv/bin/python" \
backup.py
}
# See if we're just supposed to update an existing install, or recovering
parse_args()
{
RECOVER=0
UPDATE=0
if [ "$1" == "--recover" ] ; then
if [ -e "vars.sh" ]; then
error "It looks like this borg was already set up, can only recover from fresh start"
fi
RECOVER=1
elif [ "$1" == "--update-paths" ] || [ "$1" == "--update" ] ; then
if [ ! -e "vars.sh" ]; then
error "Can't update, not set up yet"
fi
UPDATE=1
elif [ -n "$1" ] ; then
error "Unknown arg $1"
elif [ -e "vars.sh" ]; then
warn "Error: BORG_DIR $BORG_DIR already looks set up."
warn "Use \"git clean\" to return it to original state if desired".
warn "Or specify --update to refresh things from latest git."
error "Giving up"
fi
}
# Make a temp dir to work in # Make a temp dir to work in
TMP=$(mktemp -d) TMP=$(mktemp -d)
@@ -69,57 +174,6 @@ notice() { msg 32 "$@" ; }
warn() { msg 31 "$@" ; } warn() { msg 31 "$@" ; }
error() { msg 31 "Error:" "$@" ; exit 1 ; } error() { msg 31 "Error:" "$@" ; exit 1 ; }
# Create pip environment
setup_venv()
{
mkdir .venv
pipenv install
}
# Install borg
install_borg()
{
curl -L --progress-bar -o "${BORG_BIN}" "${BORG_URL}"
if ! echo "${BORG_SHA256} ${BORG_BIN}" | sha256sum -c ; then
error "hash error"
fi
chmod +x "${BORG_BIN}"
}
# Create wrapper to execute borg
create_borg_wrapper()
{
BORG=${BORG_DIR}/borg.sh
BORG_REPO="ssh://${BACKUP_USER}@${BACKUP_HOST}/./${BACKUP_REPO}"
SSH=$BORG_DIR/ssh
cat >"$BORG" <<EOF
#!/bin/sh
export BORG_REPO=${BORG_REPO}
export BORG_PASSCOMMAND="cat ${BORG_DIR}/passphrase"
export BORG_HOST_ID=${HOSTID}
export BORG_BASE_DIR=${BORG_DIR}
export BORG_CACHE_DIR=${BORG_DIR}/cache
export BORG_CONFIG_DIR=${BORG_DIR}/config
if [ "\$1" = "--rw" ] ; then
echo "=== Need SSH key passphrase. Check Bitwarden for:"
echo "=== borg $(hostname) / read-write SSH key"
export BORG_RSH="ssh -F $SSH/config -o BatchMode=no -i $SSH/id_ecdsa"
shift
else
export BORG_RSH="ssh -F $SSH/config -i $SSH/id_ecdsa_appendonly"
fi
exec "${BORG_BIN}" "\$@"
EOF
chmod +x "$BORG"
if ! "$BORG" -h >/dev/null ; then
error "Can't run the new borg wrapper; does borg work?"
fi
}
print_random_key() print_random_key()
{ {
dd if=/dev/urandom | tr -dc 'a-zA-Z0-9' | head -c 16 dd if=/dev/urandom | tr -dc 'a-zA-Z0-9' | head -c 16
@@ -127,8 +181,18 @@ print_random_key()
generate_keys() generate_keys()
{ {
PASS_SSH=$(print_random_key) if [ $RECOVER -eq 1 ] ; then
notice "Recovering configuration in order to use an existing backup"
read -s -p "Repo key for \"borg ${HOSTNAME}\": " PASS_REPOKEY
echo
read -s -p "Again: " PASS_REPOKEY2
echo
if [ -z "$PASS_REPOKEY" ] || [ $PASS_REPOKEY != $PASS_REPOKEY2 ] ; then
error "Bad repo key"
fi
else
PASS_REPOKEY=$(print_random_key) PASS_REPOKEY=$(print_random_key)
fi
echo "$PASS_REPOKEY" > passphrase echo "$PASS_REPOKEY" > passphrase
chmod 600 passphrase chmod 600 passphrase
} }
@@ -148,8 +212,8 @@ configure_ssh()
log "Creating SSH keys" log "Creating SSH keys"
ssh-keygen -N "" -t ecdsa \ ssh-keygen -N "" -t ecdsa \
-C "backup-appendonly@$HOSTID" -f "$SSH/id_ecdsa_appendonly" -C "backup-appendonly@$HOSTID" -f "$SSH/id_ecdsa_appendonly"
ssh-keygen -N "$PASS_SSH" -t ecdsa \ ssh-keygen -N "" -t ecdsa \
-C "backup@$HOSTID" -f "$SSH/id_ecdsa" -C "backup-notify@$HOSTID" -f "$SSH/id_ecdsa_notify"
# Create config snippets # Create config snippets
log "Creating SSH config and wrapper script" log "Creating SSH config and wrapper script"
@@ -167,25 +231,29 @@ EOF
# Connect to backup host, using persistent control socket # Connect to backup host, using persistent control socket
log "Connecting to server" log "Connecting to server"
log "Please enter password; look in Bitwarden for: ${BACKUP_USER}@${BACKUP_HOST}" log "Please enter password; look in Bitwarden for: ssh ${BACKUP_HOST} / ${BACKUP_USER}"
ssh -F "$SSH/config" -o BatchMode=no -o PubkeyAuthentication=no \ ssh -F "$SSH/config" -o BatchMode=no -o PubkeyAuthentication=no \
-o ControlMaster=yes -o ControlPath="$TMP/ssh-control" \ -o ControlMaster=yes -o ControlPath="$TMP/ssh-control" \
-o StrictHostKeyChecking=accept-new \ -o StrictHostKeyChecking=accept-new \
-p "${BACKUP_PORT}" \
-f "${BACKUP_USER}@${BACKUP_HOST}" sleep 600 -f "${BACKUP_USER}@${BACKUP_HOST}" sleep 600
if ! run_ssh_command true >/dev/null 2>&1 </dev/null ; then if ! run_ssh_command true >/dev/null 2>&1 </dev/null ; then
error "SSH failed" error "SSH failed"
fi fi
log "Connected to ${BACKUP_USER}@${BACKUP_HOST}" log "Connected to ${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}"
# Since we now have an SSH connection, check that the repo doesn't exist # Since we now have an SSH connection, check repo existence
if run_ssh_command "test -e $BACKUP_REPO" ; then if [ $RECOVER -eq 0 ] && run_ssh_command "test -e $BACKUP_REPO"; then
error "$BACKUP_REPO already exists on the server, bailing out" error "$BACKUP_REPO already exists on the server, bailing out"
elif [ $RECOVER -ne 0 ] && ! run_ssh_command "test -e $BACKUP_REPO"; then
error "$BACKUP_REPO does NOT exist on the server, can't recover backup config"
fi fi
# Copy SSH keys to the server's authorized_keys file, removing any # Copy SSH keys to the server's authorized_keys file, removing any
# existing keys with this HOSTID. # existing keys with this HOSTID.
log "Setting up SSH keys on remote host" log "Setting up SSH keys on remote host"
cmd="borg/borg serve --restrict-to-repository ~/$BACKUP_REPO" REMOTE_BORG="borg/borg"
cmd="$REMOTE_BORG serve --restrict-to-repository ~/$BACKUP_REPO"
keys=".ssh/authorized_keys" keys=".ssh/authorized_keys"
backup="${keys}.old-$(date +%Y%m%d-%H%M%S)" backup="${keys}.old-$(date +%Y%m%d-%H%M%S)"
@@ -194,14 +262,14 @@ EOF
run_ssh_command "if cmp -s $backup $keys; then rm $backup ; fi" run_ssh_command "if cmp -s $backup $keys; then rm $backup ; fi"
run_ssh_command "cat >> .ssh/authorized_keys" <<EOF run_ssh_command "cat >> .ssh/authorized_keys" <<EOF
command="$cmd --append-only",restrict $(cat "$SSH/id_ecdsa_appendonly.pub") command="$cmd --append-only",restrict $(cat "$SSH/id_ecdsa_appendonly.pub")
command="borg/notify.sh",restrict $(cat "$SSH/id_ecdsa_appendonly.pub") command="borg/notify.sh",restrict $(cat "$SSH/id_ecdsa_notify.pub")
command="$cmd",restrict $(cat "$SSH/id_ecdsa.pub")
EOF EOF
# Test that everything worked # Test that everything worked
log "Testing SSH login with new key" log "Testing SSH login with new key"
if ! ssh -F "$SSH/config" -i "$SSH/id_ecdsa_appendonly" -T \ if ! ssh -F "$SSH/config" -i "$SSH/id_ecdsa_appendonly" -T \
"${BACKUP_USER}@${BACKUP_HOST}" borg --version </dev/null ; then -p "${BACKUP_PORT}" "${BACKUP_USER}@${BACKUP_HOST}" "$REMOTE_BORG" \
--version </dev/null ; then
error "Logging in with a key failed -- is server set up correctly?" error "Logging in with a key failed -- is server set up correctly?"
fi fi
log "Remote connection OK!" log "Remote connection OK!"
@@ -230,21 +298,27 @@ EOF
configure_systemd() configure_systemd()
{ {
TIMER=borg-backup.timer TIMER=${SYSTEMD_UNIT}.timer
SERVICE=borg-backup.service SERVICE=${SYSTEMD_UNIT}.service
TIMER_UNIT=${BORG_DIR}/${TIMER} TIMER_UNIT=${BORG_DIR}/${TIMER}
SERVICE_UNIT=${BORG_DIR}/${SERVICE} SERVICE_UNIT=${BORG_DIR}/${SERVICE}
log "Creating systemd files" log "Creating systemd files"
# Choose a time between 1am and 6am based on this hostname
HASH=$(echo hash of "$HOSTNAME" | sha1sum)
HOUR=$((0x${HASH:0:8} % 5 + 1))
MINUTE=$((0x${HASH:8:8} % 6 * 10))
TIME=$(printf %02d:%02d:00 $HOUR $MINUTE)
log "Backup time is $TIME"
cat > "$TIMER_UNIT" <<EOF cat > "$TIMER_UNIT" <<EOF
[Unit] [Unit]
Description=Borg backup to ${BACKUP_HOST} Description=Borg backup to ${BACKUP_HOST}
[Timer] [Timer]
OnCalendar=*-*-* 01:00:00 OnCalendar=*-*-* $TIME
RandomizedDelaySec=1800
FixedRandomDelay=true
Persistent=true Persistent=true
[Install] [Install]
@@ -261,8 +335,23 @@ ExecStart=${BORG_DIR}/backup.py
 Nice=10
 IOSchedulingClass=best-effort
 IOSchedulingPriority=6
+Restart=on-failure
+RestartSec=600
 EOF

+if [ $RECOVER -eq 1 ] ; then
+log "Partially setting up systemd"
+ln -sfv "${TIMER_UNIT}" /etc/systemd/system
+ln -sfv "${SERVICE_UNIT}" /etc/systemd/system
+systemctl --no-ask-password daemon-reload
+systemctl --no-ask-password stop ${TIMER}
+systemctl --no-ask-password disable ${TIMER}
+warn "Since we're recovering, systemd automatic backups aren't enabled"
+warn "Do something like this to configure automatic backups:"
+echo " sudo systemctl enable ${TIMER} &&"
+echo " sudo systemctl start ${TIMER}"
+warn ""
+else
 log "Setting up systemd"
 if (
 ln -sfv "${TIMER_UNIT}" /etc/systemd/system &&
@@ -272,7 +361,7 @@ EOF
 systemctl --no-ask-password start ${TIMER}
 ); then
 log "Backup timer installed:"
-systemctl list-timers ${TIMER}
+systemctl list-timers --no-pager ${TIMER}
 else
 warn ""
 warn "Systemd setup failed"
@@ -284,59 +373,66 @@ EOF
 echo " sudo systemctl start ${TIMER}"
 warn ""
 fi
-}
-
-update_paths()
-{
-sed -i \
--e "s!\${HOSTNAME}!$(hostname)!g" \
--e "s!\${BORG_DIR}!${BORG_DIR}!g" \
--e "s!\${BACKUP_USER}!${BACKUP_USER}!g" \
--e "s!\${BACKUP_HOST}!${BACKUP_HOST}!g" \
--e "s!\${BACKUP_REPO}!${BACKUP_REPO}!g" \
-README.md
-sed -i\
--e "1c#!${BORG_DIR}/.venv/bin/python" \
-backup.py
+fi
 }

 git_setup()
 {
-if ! git checkout -b "setup-$(hostname)" ; then
+log "Committing local changes to git"
+if ! git checkout -b "setup-${HOSTNAME}" ||
+! git add ${SYSTEMD_UNIT}.service ${SYSTEMD_UNIT}.timer vars.sh ||
+! git commit -a -m "autocommit after initial setup on ${HOSTNAME}" ; then
 warn "Git setup failed; ignoring"
 return
 fi
-log "Committing local changes to git"
-git add README.md borg-backup.service borg-backup.timer borg.sh
-git commit -a -m "autocommit after initial setup on $(hostname)"
 }

-log "Configuration:"
-log " Backup server host: ${BACKUP_HOST}"
-log " Backup server user: ${BACKUP_USER}"
-log " Repository path: ${BACKUP_REPO}"
+main() {
+if [ $UPDATE -eq 1 ] ; then
+notice "Non-destructively updating paths, variables, and venv..."
+source vars.sh
+set_default_variables
+install_templated_files
+update_paths
 setup_venv
-install_borg
+create_borg_vars
+create_borg_wrapper
+notice "Testing SSH"
+ssh -F "$SSH/config" -i "$SSH/id_ecdsa_appendonly" \
+-o BatchMode=no -o StrictHostKeyChecking=ask \
+-p "${BACKUP_PORT}" "${BACKUP_USER}@${BACKUP_HOST}" info >/dev/null
+notice "Testing borg: if host location changed, say 'y' here"
+${BORG_DIR}/borg.sh info
+notice "Done -- check 'git diff' and verify changes."
+exit 0
+fi
+set_default_variables
+setup_venv
+create_borg_vars
 generate_keys
 configure_ssh
-create_repo
+[ $RECOVER -eq 0 ] && create_repo
 export_keys
 configure_systemd
+install_templated_files
 update_paths
 git_setup
 echo
-notice "Add these two passwords to Bitwarden:"
+if [ $RECOVER -eq 1 ] ; then
+notice "You should be set up with borg pointing to the existing repo now."
+notice "Use commands like these to look at the backup:"
+notice " sudo ${BORG_DIR}/borg.sh info"
+notice " sudo ${BORG_DIR}/borg.sh list"
+notice "You'll want to now restore files like ${BORG_DIR}/config.yaml before enabling systemd timers"
+else
+notice "Add this password to Bitwarden:"
 notice ""
-notice " Name: borg $(hostname)"
-notice " Username: read-write ssh key"
-notice " Password: $PASS_SSH"
-notice ""
-notice " Name: borg $(hostname)"
+notice " Name: borg ${HOSTNAME}"
 notice " Username: repo key"
 notice " Password: $PASS_REPOKEY"
 notice ""
@@ -344,7 +440,12 @@ notice "Test the backup file list with"
 notice " sudo ${BORG_DIR}/backup.py --dry-run"
 notice "and make any necessary adjustments to:"
 notice " ${BORG_DIR}/config.yaml"
+fi
 echo
 echo "All done"
+}
+
+parse_args "$@"
+main
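
The hash-based `OnCalendar` time above staggers clients between roughly 01:00 and 06:00 so they don't all hit the backup server at once. To preview which slot a given machine would get, the same arithmetic can be run standalone; this is only a sketch that mirrors the hashing shown in the diff (the script name and argument handling here are illustrative, not part of the repo):

    #!/bin/bash
    # Preview the backup time initial-setup.sh would choose for a host,
    # mirroring the hashing in configure_systemd(). Illustrative only.
    host="${1:-$(hostname)}"
    hash=$(echo hash of "$host" | sha1sum)
    hour=$((0x${hash:0:8} % 5 + 1))      # 1-5, i.e. 01:xx to 05:xx
    minute=$((0x${hash:8:8} % 6 * 10))   # 0, 10, ..., 50
    printf 'backup time for %s: %02d:%02d:00\n' "$host" "$hour" "$minute"

The timer then adds up to `RandomizedDelaySec=1800` on top of this, and `FixedRandomDelay=true` keeps that extra offset stable per machine across runs.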


@@ -1,12 +0,0 @@
-#!/bin/bash
-
-BORG="$(dirname "$0")/borg.sh --rw"
-
-set -e
-$BORG prune \
-    --verbose \
-    --stats \
-    --keep-within=7d \
-    --keep-daily=14 \
-    --keep-weekly=8 \
-    --keep-monthly=-1

templates/README.md

@@ -0,0 +1,145 @@
Initial setup
=============

Run on client:

    sudo git clone https://git.jim.sh/jim/borg-setup.git /opt/borg
    sudo /opt/borg/initial-setup.sh

Customize `/opt/borg/config.yaml` as desired.
Cheat sheet
===========

*After setup, /opt/borg/README.md will have the variables in this
section filled in automatically.*
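
The setup script fills these in with simple `sed`-style substitutions over the files in `templates/` (the sed in the diff above shows the idea). A minimal sketch, with a hypothetical helper name, only the variables visible in this README, and assuming the values from `vars.sh` are already set in the environment:

    # Hypothetical helper: render ${VAR}-style placeholders in a template.
    render_template() {
        sed \
            -e "s!\${HOSTNAME}!${HOSTNAME}!g" \
            -e "s!\${BORG_DIR}!${BORG_DIR}!g" \
            -e "s!\${BACKUP_USER}!${BACKUP_USER}!g" \
            -e "s!\${BACKUP_HOST}!${BACKUP_HOST}!g" \
            -e "s!\${BACKUP_PORT}!${BACKUP_PORT}!g" \
            -e "s!\${BACKUP_REPO}!${BACKUP_REPO}!g" \
            "$1"
    }
    # e.g.: render_template templates/README.md > README.md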
## Configuration

    Hostname: ${HOSTNAME}
    Base directory: ${BORG_DIR}
    Destination: ${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}
    Repository: ${BACKUP_REPO}
## Commands

See when the next backup is scheduled:

    systemctl list-timers ${SYSTEMD_UNIT}.timer

See the log of the most recent backup attempt:

    ${BORG_DIR}/logs.sh
    ${BORG_DIR}/logs.sh -f

Start a backup now:

    sudo systemctl restart ${SYSTEMD_UNIT}

Interrupt a backup in progress:

    sudo systemctl stop ${SYSTEMD_UNIT}

Show backups and related info:

    sudo ${BORG_DIR}/borg.sh info
    sudo ${BORG_DIR}/borg.sh list

Run Borg using the read-write SSH key:

    sudo ${BORG_DIR}/borg.sh --rw list

Mount and look at files:

    mkdir mnt
    sudo ${BORG_DIR}/borg.sh mount :: mnt
    sudo -s   # to explore as root
    sudo umount mnt

Update borg-setup from git:

    cd /opt/borg
    sudo git remote update
    sudo git rebase origin/master
    sudo ./initial-setup.sh --update
    sudo git commit -m 'Update borg-setup'
## Compaction and remote access

Old backups are "pruned" automatically, but because the SSH key is
append-only, no space is actually recovered on the server; the data is
just marked for deletion.  If you are sure that the client system was
not compromised, you can run compaction manually directly on the
backup host by logging in via SSH (bitwarden `ssh ${BACKUP_HOST} / ${BACKUP_USER}`)
and compacting there:

    ssh -p ${BACKUP_PORT} ${BACKUP_USER}@${BACKUP_HOST} borg/borg compact --verbose --progress ${BACKUP_REPO}

This doesn't require the repo key.  That key shouldn't be entered on
the untrusted backup host, so for operations that need it, use a
trusted host and run borg remotely instead, e.g.:

    ${BORG_BIN} --remote-path borg/borg info ssh://${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}/borg/${HOSTNAME}

The repo passphrase is in bitwarden `borg ${HOSTNAME} / repo key`.
Design
======

- On the server, we have a separate user account "jim-backups".  The
  password for this account is in bitwarden in the "Backups" folder,
  under `ssh backup.jim.sh`.

- Repository keys are repokeys, which get stored on the server, inside
  the repo.  Passphrases are stored:
  - on clients (in `/opt/borg/passphrase`, for making backups)
  - in bitwarden (under `borg <hostname>`, user `repo key`)

- Each client has two passwordless SSH keys for connecting to the
  server (see the authorized_keys sketch below):
  - `/opt/borg/ssh/id_ecdsa_appendonly`
    - configured on server for append-only operation
    - used for making backups
  - `/opt/borg/ssh/id_ecdsa_notify`
    - configured on server for running `borg/notify.sh` only
    - used for sending email notifications on errors

- Systemd timers start daily backups:

      /etc/systemd/system/borg-backup.service -> /opt/borg/borg-backup.service
      /etc/systemd/system/borg-backup.timer -> /opt/borg/borg-backup.timer

- The backup script `/opt/borg/backup.py` uses the configuration in
  `/opt/borg/config.yaml` to generate our own list of files, excluding
  anything that's too large by default.  This requires borg 1.2 or newer.
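
The "configured on server" restriction lives in `~/.ssh/authorized_keys` on the backup host. Expanded from the `initial-setup.sh` diff above, the entries look roughly like this (public key material elided):

    command="borg/borg serve --restrict-to-repository ~/${BACKUP_REPO} --append-only",restrict <contents of id_ecdsa_appendonly.pub>
    command="borg/notify.sh",restrict <contents of id_ecdsa_notify.pub>

The `command=` option forces every login with that key into the given program, and `restrict` disables port forwarding, agent forwarding, and PTY allocation for it.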
Notes
=====

Building a Borg binary from git:

    # Building Borg binary from git
    sudo apt install python3.9 scons libacl1-dev libfuse-dev libpython3.9-dev patchelf
    git clone https://github.com/borgbackup/borg.git
    cd borg
    virtualenv --python=python3.9 borg-env
    source borg-env/bin/activate
    pip install -r requirements.d/development.txt
    pip install pyinstaller
    pip install llfuse
    pip install -e .[llfuse]
    pyinstaller --clean --noconfirm scripts/borg.exe.spec
    pip install staticx
    # for x86
    staticx -l /lib/x86_64-linux-gnu/libm.so.6 dist/borg.exe borg.x86_64
    # for ARM; see https://github.com/JonathonReinhart/staticx/issues/209
    staticx -l /lib/arm-linux-gnueabihf/libm.so.6 dist/borg.exe borg.armv7l

Then run the resulting binary (`borg.x86_64` or `borg.armv7l`) and
confirm it works with `--version`.

*Note:* This uses the deprecated `llfuse` instead of the newer `pyfuse3`.
`pyfuse3` doesn't work because, at minimum, it pulls in `trio`, which
requires `ssl`, which is explicitly excluded by `scripts/borg.exe.spec`.

templates/logs.sh

@@ -0,0 +1,3 @@
#!/bin/bash
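# Show journal output for the backup unit since its most recent start;
# extra arguments (e.g. -f) are passed through to journalctl.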
exec journalctl --all --unit ${SYSTEMD_UNIT} --since "$(systemctl show -P ExecMainStartTimestamp ${SYSTEMD_UNIT})" "$@"

templates/notify.sh

@@ -0,0 +1,27 @@
#!/bin/bash
set -e

. "$(dirname "$0")"/vars.sh

# Send a notification email using a script on the backup host.
# First argument is a short summary used in the subject line, second
# argument is the destination address; the mail body is provided on
# stdin.

if tty -s ; then
    echo 'Refusing to read mail body from terminal'
    exit 1
fi

SUMMARY="$1"
EMAIL="$2"

# Remote notify.sh wants the subject as the first line of the body, not
# as an argument, since it's a bit messy to pass complex strings through
# ssh command lines.
( echo "backup $HOSTNAME: $SUMMARY" ; cat ) | \
    ssh \
        -F "$SSH/config" \
        -i "$SSH/id_ecdsa_notify" \
        -p "$BACKUP_PORT" \
        "$BACKUP_USER@$BACKUP_HOST" \
        borg/notify.sh "$EMAIL"
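
A quick way to exercise the rendered script from the client, assuming it is installed as `${BORG_DIR}/notify.sh` like the other templates (the summary text and address are just examples):

    echo "2 files changed while we read them" | \
        sudo ${BORG_DIR}/notify.sh "finished with warnings" admin@example.com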