Compare commits


29 Commits

SHA1 Message Date
a1cf833079 initial-setup: include systemd unit in vars 2025-03-15 12:52:39 -04:00
2024519b16 initial-setup: fix script permissions 2025-03-15 11:12:53 -04:00
90e9310ae0 Update README, add logs.sh 2025-03-15 11:08:52 -04:00
a9ab963d49 initial-setup: support different systemd unit names 2025-03-14 23:34:32 -04:00
eec423bc6f initial-setup: bugfixes 2025-03-14 15:00:29 -04:00
8caf8e04fb initial-setup: clean up, add PORT option, improve --update 2025-03-14 14:16:18 -04:00
b16c4d7330 Support recovery by setting up an existing repo 2024-07-01 14:24:28 -04:00
59ceff6d1a Update Pipfile.lock for newer Cython compatibility 2023-11-03 13:52:32 -04:00
9703c5fc72 bin: rebuild borg.x86_64 with staticx 0.13.8 2022-09-17 11:55:44 -04:00
c68b867b50 Add notes about host ID 2021-11-16 10:11:08 -05:00
7dea155f58 use python3 when getting UUID 2021-11-16 09:55:58 -05:00
f6e8863128 backup: adjust email formatting 2021-10-26 21:37:53 -04:00
342e2cd0e8 Update README 2021-10-26 16:03:52 -04:00
f14b0d2d4d backup.py: fix notification error 2021-10-26 16:00:46 -04:00
b74f9c75a2 initial-setup: fix --update 2021-10-26 15:55:32 -04:00
9b38c248d8 borg: add ARM binary for Pi; update scripts to use it 2021-10-26 15:50:08 -04:00
46f9f98860 backup: show errors at top of email notification 2021-10-26 13:24:39 -04:00
dc7d72b2da initial-setup: make systemd units restart on failure 2021-10-26 13:24:28 -04:00
cb12e09c46 backup: rework output to make notification emails easier to read 2021-10-26 13:20:21 -04:00
1115d1f821 backup: rename pstr() helper to b2s()
The helper is just bytes->str conversion with errors=backslashreplace,
which we can use for more than just paths.
2021-10-26 12:54:41 -04:00
e2f92ccb7a readme: fix typo 2021-10-19 14:48:28 -04:00
a15cb5b07d all: remove concept of read-write key
We don't need a read-write key: we can just SSH directly to
jim-backups@backup.jim.sh instead and run commands that way.
Remove read-write key and document it in the README.

Also add some tools to update the README variables on updates.
2021-10-19 14:46:34 -04:00
51c5b5e9ca backup: fix prune archive name 2021-10-19 12:28:20 -04:00
ed8ea15aa7 backup: only prune archives that match default naming pattern 2021-10-19 12:18:46 -04:00
481e01896b backup: fix issue with ignoring "changed while we backed it up" warnings 2021-10-19 12:14:42 -04:00
e85e08cace backup: call prune after backup; add run_borg helper
Automatically prunes after backup, although this doesn't actually
free up space (because we're in append-only mode).
2021-10-19 11:16:17 -04:00
4b7802ad5f backup: flush stderr after all writes 2021-10-18 19:35:39 -04:00
4a30b82e39 backup: replace simple max size with rule-based system
Now individual files or patterns can have their own maximum sizes.
2021-10-18 17:43:33 -04:00
ac12b42cad backup: rename force-include to unexclude
Force-include is a misnomer because it won't include files
that weren't considered at all (like files in an excluded subdir).
Instead, call it "unexclude" to make it slightly clearer that this
will just override the exclusions.
2021-10-18 16:25:23 -04:00
15 changed files with 752 additions and 579 deletions

.gitea/README.md (symbolic link, 1 line changed)

@@ -0,0 +1 @@
../templates/README.md

.gitignore (vendored, 2 lines changed)

@@ -5,3 +5,5 @@ config/
key.txt
passphrase
ssh/
/README.md
/notify.sh


@@ -38,7 +38,9 @@ rebase:
git pull
git checkout -
git rebase master
./initial-setup.sh --update
systemctl daemon-reload
git status
# Show status of most recent backup run
.PHONY: status

Pipfile.lock (generated, 174 lines changed)

@@ -18,10 +18,11 @@
"default": {
"bracex": {
"hashes": [
"sha256:01f715cd0ed7a622ec8b32322e715813f7574de531f09b70f6f3b2c10f682425",
"sha256:64e2a6d14de9c8e022cf40539ac8468ba7c4b99550a2b05fc87fd20e392e568f"
"sha256:a27eaf1df42cf561fed58b7a8f3fdf129d1ea16a81e1fadd1d17989bc6384beb",
"sha256:efdc71eff95eaff5e0f8cfebe7d01adf2c8637c8c92edaf63ef348c241a82418"
],
"version": "==2.1.1"
"markers": "python_version >= '3.8'",
"version": "==2.4"
},
"humanfriendly": {
"hashes": [
@@ -33,107 +34,126 @@
},
"pyyaml": {
"hashes": [
"sha256:08682f6b72c722394747bddaf0aa62277e02557c0fd1c42cb853016a38f8dedf",
"sha256:0f5f5786c0e09baddcd8b4b45f20a7b5d61a7e7e99846e3c799b05c7c53fa696",
"sha256:129def1b7c1bf22faffd67b8f3724645203b79d8f4cc81f674654d9902cb4393",
"sha256:294db365efa064d00b8d1ef65d8ea2c3426ac366c0c4368d930bf1c5fb497f77",
"sha256:3b2b1824fe7112845700f815ff6a489360226a5609b96ec2190a45e62a9fc922",
"sha256:3bd0e463264cf257d1ffd2e40223b197271046d09dadf73a0fe82b9c1fc385a5",
"sha256:4465124ef1b18d9ace298060f4eccc64b0850899ac4ac53294547536533800c8",
"sha256:49d4cdd9065b9b6e206d0595fee27a96b5dd22618e7520c33204a4a3239d5b10",
"sha256:4e0583d24c881e14342eaf4ec5fbc97f934b999a6828693a99157fde912540cc",
"sha256:5accb17103e43963b80e6f837831f38d314a0495500067cb25afab2e8d7a4018",
"sha256:607774cbba28732bfa802b54baa7484215f530991055bb562efbed5b2f20a45e",
"sha256:6c78645d400265a062508ae399b60b8c167bf003db364ecb26dcab2bda048253",
"sha256:72a01f726a9c7851ca9bfad6fd09ca4e090a023c00945ea05ba1638c09dc3347",
"sha256:74c1485f7707cf707a7aef42ef6322b8f97921bd89be2ab6317fd782c2d53183",
"sha256:895f61ef02e8fed38159bb70f7e100e00f471eae2bc838cd0f4ebb21e28f8541",
"sha256:8c1be557ee92a20f184922c7b6424e8ab6691788e6d86137c5d93c1a6ec1b8fb",
"sha256:bb4191dfc9306777bc594117aee052446b3fa88737cd13b7188d0e7aa8162185",
"sha256:bfb51918d4ff3d77c1c856a9699f8492c612cde32fd3bcd344af9be34999bfdc",
"sha256:c20cfa2d49991c8b4147af39859b167664f2ad4561704ee74c1de03318e898db",
"sha256:cb333c16912324fd5f769fff6bc5de372e9e7a202247b48870bc251ed40239aa",
"sha256:d2d9808ea7b4af864f35ea216be506ecec180628aced0704e34aca0b040ffe46",
"sha256:d483ad4e639292c90170eb6f7783ad19490e7a8defb3e46f97dfe4bacae89122",
"sha256:dd5de0646207f053eb0d6c74ae45ba98c3395a571a2891858e87df7c9b9bd51b",
"sha256:e1d4970ea66be07ae37a3c2e48b5ec63f7ba6804bdddfdbd3cfd954d25a82e63",
"sha256:e4fac90784481d221a8e4b1162afa7c47ed953be40d31ab4629ae917510051df",
"sha256:fa5ae20527d8e831e8230cbffd9f8fe952815b2b7dae6ffec25318803a7528fc",
"sha256:fd7f6999a8070df521b6384004ef42833b9bd62cfee11a09bda1079b4b704247",
"sha256:fdc842473cd33f45ff6bce46aea678a54e3d21f1b61a7750ce3c498eedfe25d6",
"sha256:fe69978f3f768926cfa37b867e3843918e012cf83f680806599ddce33c2c68b0"
"sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5",
"sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc",
"sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df",
"sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741",
"sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206",
"sha256:18aeb1bf9a78867dc38b259769503436b7c72f7a1f1f4c93ff9a17de54319b27",
"sha256:1d4c7e777c441b20e32f52bd377e0c409713e8bb1386e1099c2415f26e479595",
"sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62",
"sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98",
"sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696",
"sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290",
"sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9",
"sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d",
"sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6",
"sha256:4fb147e7a67ef577a588a0e2c17b6db51dda102c71de36f8549b6816a96e1867",
"sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47",
"sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486",
"sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6",
"sha256:596106435fa6ad000c2991a98fa58eeb8656ef2325d7e158344fb33864ed87e3",
"sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007",
"sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938",
"sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0",
"sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c",
"sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735",
"sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d",
"sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28",
"sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4",
"sha256:9046c58c4395dff28dd494285c82ba00b546adfc7ef001486fbf0324bc174fba",
"sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8",
"sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5",
"sha256:afd7e57eddb1a54f0f1a974bc4391af8bcce0b444685d936840f125cf046d5bd",
"sha256:b1275ad35a5d18c62a7220633c913e1b42d44b46ee12554e5fd39c70a243d6a3",
"sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0",
"sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515",
"sha256:baa90d3f661d43131ca170712d903e6295d1f7a0f595074f151c0aed377c9b9c",
"sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c",
"sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924",
"sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34",
"sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43",
"sha256:c8098ddcc2a85b61647b2590f825f3db38891662cfc2fc776415143f599bb859",
"sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673",
"sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54",
"sha256:d858aa552c999bc8a8d57426ed01e40bef403cd8ccdd0fc5f6f04a00414cac2a",
"sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b",
"sha256:f003ed9ad21d6a4713f0a9b5a7a0a79e08dd0f221aff4525a2be4c346ee60aab",
"sha256:f22ac1c3cac4dbc50079e965eba2c1058622631e526bd9afd45fedd49ba781fa",
"sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c",
"sha256:fca0e3a251908a499833aa292323f32437106001d436eca0e6e7833256674585",
"sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d",
"sha256:fd66fc5d0da6d9815ba2cebeb4205f95818ff4b79c3ebe268e75d961704af52f"
],
"index": "pypi",
"version": "==5.4.1"
"version": "==6.0.1"
},
"wcmatch": {
"hashes": [
"sha256:4d54ddb506c90b5a5bba3a96a1cfb0bb07127909e19046a71d689ddfb18c3617",
"sha256:9146b1ab9354e0797ef6ef69bc89cb32cb9f46d1b9eeef69c559aeec8f3bffb6"
"sha256:14554e409b142edeefab901dc68ad570b30a72a8ab9a79106c5d5e9a6d241bd5",
"sha256:86c17572d0f75cbf3bcb1a18f3bf2f9e72b39a9c08c9b4a74e991e1882a8efb3"
],
"index": "pypi",
"version": "==8.2"
"version": "==8.5"
}
},
"develop": {
"mypy": {
"hashes": [
"sha256:088cd9c7904b4ad80bec811053272986611b84221835e079be5bcad029e79dd9",
"sha256:0aadfb2d3935988ec3815952e44058a3100499f5be5b28c34ac9d79f002a4a9a",
"sha256:119bed3832d961f3a880787bf621634ba042cb8dc850a7429f643508eeac97b9",
"sha256:1a85e280d4d217150ce8cb1a6dddffd14e753a4e0c3cf90baabb32cefa41b59e",
"sha256:3c4b8ca36877fc75339253721f69603a9c7fdb5d4d5a95a1a1b899d8b86a4de2",
"sha256:3e382b29f8e0ccf19a2df2b29a167591245df90c0b5a2542249873b5c1d78212",
"sha256:42c266ced41b65ed40a282c575705325fa7991af370036d3f134518336636f5b",
"sha256:53fd2eb27a8ee2892614370896956af2ff61254c275aaee4c230ae771cadd885",
"sha256:704098302473cb31a218f1775a873b376b30b4c18229421e9e9dc8916fd16150",
"sha256:7df1ead20c81371ccd6091fa3e2878559b5c4d4caadaf1a484cf88d93ca06703",
"sha256:866c41f28cee548475f146aa4d39a51cf3b6a84246969f3759cb3e9c742fc072",
"sha256:a155d80ea6cee511a3694b108c4494a39f42de11ee4e61e72bc424c490e46457",
"sha256:adaeee09bfde366d2c13fe6093a7df5df83c9a2ba98638c7d76b010694db760e",
"sha256:b6fb13123aeef4a3abbcfd7e71773ff3ff1526a7d3dc538f3929a49b42be03f0",
"sha256:b94e4b785e304a04ea0828759172a15add27088520dc7e49ceade7834275bedb",
"sha256:c0df2d30ed496a08de5daed2a9ea807d07c21ae0ab23acf541ab88c24b26ab97",
"sha256:c6c2602dffb74867498f86e6129fd52a2770c48b7cd3ece77ada4fa38f94eba8",
"sha256:ceb6e0a6e27fb364fb3853389607cf7eb3a126ad335790fa1e14ed02fba50811",
"sha256:d9dd839eb0dc1bbe866a288ba3c1afc33a202015d2ad83b31e875b5905a079b6",
"sha256:e4dab234478e3bd3ce83bac4193b2ecd9cf94e720ddd95ce69840273bf44f6de",
"sha256:ec4e0cd079db280b6bdabdc807047ff3e199f334050db5cbb91ba3e959a67504",
"sha256:ecd2c3fe726758037234c93df7e98deb257fd15c24c9180dacf1ef829da5f921",
"sha256:ef565033fa5a958e62796867b1df10c40263ea9ded87164d67572834e57a174d"
"sha256:19f905bcfd9e167159b3d63ecd8cb5e696151c3e59a1742e79bc3bcb540c42c7",
"sha256:21a1ad938fee7d2d96ca666c77b7c494c3c5bd88dff792220e1afbebb2925b5e",
"sha256:40b1844d2e8b232ed92e50a4bd11c48d2daa351f9deee6c194b83bf03e418b0c",
"sha256:41697773aa0bf53ff917aa077e2cde7aa50254f28750f9b88884acea38a16169",
"sha256:49ae115da099dcc0922a7a895c1eec82c1518109ea5c162ed50e3b3594c71208",
"sha256:4c46b51de523817a0045b150ed11b56f9fff55f12b9edd0f3ed35b15a2809de0",
"sha256:4cbe68ef919c28ea561165206a2dcb68591c50f3bcf777932323bc208d949cf1",
"sha256:4d01c00d09a0be62a4ca3f933e315455bde83f37f892ba4b08ce92f3cf44bcc1",
"sha256:59a0d7d24dfb26729e0a068639a6ce3500e31d6655df8557156c51c1cb874ce7",
"sha256:68351911e85145f582b5aa6cd9ad666c8958bcae897a1bfda8f4940472463c45",
"sha256:7274b0c57737bd3476d2229c6389b2ec9eefeb090bbaf77777e9d6b1b5a9d143",
"sha256:81af8adaa5e3099469e7623436881eff6b3b06db5ef75e6f5b6d4871263547e5",
"sha256:82e469518d3e9a321912955cc702d418773a2fd1e91c651280a1bda10622f02f",
"sha256:8b27958f8c76bed8edaa63da0739d76e4e9ad4ed325c814f9b3851425582a3cd",
"sha256:8c223fa57cb154c7eab5156856c231c3f5eace1e0bed9b32a24696b7ba3c3245",
"sha256:8f57e6b6927a49550da3d122f0cb983d400f843a8a82e65b3b380d3d7259468f",
"sha256:925cd6a3b7b55dfba252b7c4561892311c5358c6b5a601847015a1ad4eb7d332",
"sha256:a43ef1c8ddfdb9575691720b6352761f3f53d85f1b57d7745701041053deff30",
"sha256:a8032e00ce71c3ceb93eeba63963b864bf635a18f6c0c12da6c13c450eedb183",
"sha256:b96ae2c1279d1065413965c607712006205a9ac541895004a1e0d4f281f2ff9f",
"sha256:bb8ccb4724f7d8601938571bf3f24da0da791fe2db7be3d9e79849cb64e0ae85",
"sha256:bbaf4662e498c8c2e352da5f5bca5ab29d378895fa2d980630656178bd607c46",
"sha256:cfd13d47b29ed3bbaafaff7d8b21e90d827631afda134836962011acb5904b71",
"sha256:d4473c22cc296425bbbce7e9429588e76e05bc7342da359d6520b6427bf76660",
"sha256:d8fbb68711905f8912e5af474ca8b78d077447d8f3918997fecbf26943ff3cbb",
"sha256:e5012e5cc2ac628177eaac0e83d622b2dd499e28253d4107a08ecc59ede3fc2c",
"sha256:eb4f18589d196a4cbe5290b435d135dee96567e07c2b2d43b5c4621b6501531a"
],
"index": "pypi",
"version": "==0.910"
"version": "==1.6.1"
},
"mypy-extensions": {
"hashes": [
"sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d",
"sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"
"sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d",
"sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"
],
"version": "==0.4.3"
},
"toml": {
"hashes": [
"sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b",
"sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"
],
"version": "==0.10.2"
"markers": "python_version >= '3.5'",
"version": "==1.0.0"
},
"types-pyyaml": {
"hashes": [
"sha256:1d9e431e9f1f78a65ea957c558535a3b15ad67ea4912bce48a6c1b613dcf81ad",
"sha256:f1d1357168988e45fa20c65aecb3911462246a84809015dd889ebf8b1db74124"
"sha256:334373d392fde0fdf95af5c3f1661885fa10c52167b14593eb856289e1855062",
"sha256:c05bc6c158facb0676674b7f11fe3960db4f389718e19e62bd2b84d6205cfd24"
],
"index": "pypi",
"version": "==5.4.10"
"version": "==6.0.12.12"
},
"typing-extensions": {
"hashes": [
"sha256:49f75d16ff11f1cd258e1b988ccff82a3ca5570217d7ad8c5f48205dd99a677e",
"sha256:d8226d10bc02a29bcc81df19a26e56a9647f8b0a6d4a83924139f4a8b01f17b7",
"sha256:f1d25edafde516b146ecd0613dabcc61409817af4766fbbcfb8d1ad4ec441a34"
"sha256:8f92fc8806f9a6b641eaa5318da32b44d401efaac0f6678c9bc448ba3605faa0",
"sha256:df8e4339e9cb77357558cbdbceca33c303714cf861d1eef15e1070055ae8b7ef"
],
"version": "==3.10.0.2"
"markers": "python_version >= '3.8'",
"version": "==4.8.0"
}
}
}

README.md (133 lines changed)

@@ -1,133 +0,0 @@
Initial setup
=============
Run on client:
sudo git clone https://git.jim.sh/jim/borg-setup.git /opt/borg
sudo /opt/borg/initial-setup.sh
Customize `/opt/borg/config.yaml` as desired.
Cheat sheet
===========
*After setup, the copy of this file on the client will have the
variables in this section filled in automatically*
### Configuration
Hostname: ${HOSTNAME}
Base directory: ${BORG_DIR}
Destination: ${BACKUP_USER}@${BACKUP_HOST}
Repository: ${BACKUP_REPO}
### Commands
See when next backup is scheduled:
systemctl list-timers borg-backup.timer
See status of most recent backup:
systemctl status --full --lines 999999 --no-pager --all borg-backup
Watch log:
journalctl --all --follow --unit borg-backup
Start backup now:
sudo systemctl start borg-backup
Interrupt backup in progress:
sudo systemctl stop borg-backup
Show backups and related info:
sudo ${BORG_DIR}/borg.sh info
sudo ${BORG_DIR}/borg.sh list
Run Borg using the read-write SSH key:
sudo ${BORG_DIR}/borg.sh --rw list
Mount and look at files:
mkdir mnt
sudo ${BORG_DIR}/borg.sh mount :: mnt
sudo -s # to explore as root
sudo umount mnt
Prune old backups. Only run this if you are sure the local system was never compromised, as object deletions could have been queued during append-only operation. Requires the SSH key password from bitwarden.
sudo ${BORG_DIR}/prune.sh
Design
======
- On server, we have a separate user account "jim-backups". Password
for this account is in bitwarden in the "Backups" folder, under `ssh
backup.jim.sh`.
- Repository keys are repokeys, which get stored on the server, inside
the repo. Passphrases are stored:
- on clients (in `/opt/borg/passphrase`, for making backups)
- in bitwarden (under `borg <hostname>`, user `repo key`)
- Each client has two SSH keys for connecting to the server:
- `/opt/borg/ssh/id_ecdsa_appendonly`
- configured on server for append-only operation
- used for making backups
- no password
- `/opt/borg/ssh/id_ecdsa`
- configured on server for read-write operation
- used for manual recovery, management, pruning
- password in bitwarden (under `borg <hostname>`, user `read-write ssh key`)
- Pruning requires the password and is a manual operation, and should only
be run when the client has not been compromised.
sudo /opt/borg/prune.sh
- Systemd timers start daily backups:
/etc/systemd/system/borg-backup.service -> /opt/borg/borg-backup.service
/etc/systemd/system/borg-backup.timer -> /opt/borg/borg-backup.timer
- Backup script `/opt/borg/backup.py` uses configuration in
`/opt/borg/config.yaml` to generate our own list of files, excluding
anything that's too large by default. This requires borg 1.2.0b1
or newer.
Notes
=====
# Building Borg.bin binary from git
git clone https://github.com/borgbackup/borg.git
cd borg
virtualenv --python=python3 borg-env
source borg-env/bin/activate
pip install -r requirements.d/development.txt
pip install pyinstaller
pip install llfuse
pip install -e .[llfuse]
pyinstaller --clean --noconfirm scripts/borg.exe.spec
pip install staticx
staticx -l /lib/x86_64-linux-gnu/libm.so.6 dist/borg.exe Borg.bin
Then see `dist/borg.exe`. Confirm the version with `dist/borg.exe --version`.
*Note:* This uses the deprecated `llfuse` instead of the newer `pyfuse3`.
`pyfuse3` doesn't work because, at minimum, it pulls in `trio` which
requires `ssl` which is explicitly excluded by
`scripts/borg.exe.spec`.

backup.py (437 lines changed)

@@ -23,44 +23,31 @@ import yaml
import wcmatch.glob # type: ignore
import humanfriendly # type: ignore
def pstr(path: bytes) -> str:
return path.decode(errors='backslashreplace')
def b2s(raw: bytes) -> str:
return raw.decode(errors='backslashreplace')
def format_size(n: int) -> str:
return humanfriendly.format_size(n, keep_width=True, binary=True)
# Type corresponding to patterns that are generated by
# wcmatch.translate: two lists of compiled REs (a,b). A path matches
# if it matches at least one regex in "a" and none in "b".
MatchPatterns = typing.Tuple[typing.List[re.Pattern], typing.List[re.Pattern]]
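The matching semantics of this type can be sketched in isolation (stdlib only; the compiled pattern pair below is a hypothetical stand-in for what `wcmatch.glob.translate` produces):

```python
import re
import typing

# Same shape as MatchPatterns above: (include regexes, exclude regexes).
MatchPatterns = typing.Tuple[typing.List[re.Pattern], typing.List[re.Pattern]]

def match_re(r: MatchPatterns, path: bytes) -> bool:
    # A path matches if it matches at least one regex in r[0]
    # and no regex in r[1].
    for a in r[0]:
        if a.match(path):
            return not any(b.match(path) for b in r[1])
    return False

# Hypothetical pair: exclude *.log, but unexclude any keep.log.
pats: MatchPatterns = ([re.compile(rb'.*\.log$')],
                       [re.compile(rb'.*/keep\.log$')])
```

Here `match_re(pats, b'/var/log/syslog.log')` matches (excluded), while `b'/var/log/keep.log'` is caught by the second list and does not.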
class Config:
roots: typing.List[bytes]
max_file_size: typing.Optional[int]
one_file_system: bool
exclude_caches: bool
exclude: typing.List[bytes]
force_include: typing.List[bytes]
exclude: MatchPatterns
unexclude: MatchPatterns
max_size_rules: typing.List[typing.Tuple[int, MatchPatterns]]
notify_email: typing.Optional[str]
def __init__(self, configfile: str):
# Read config
with open(configfile, 'r') as f:
config = yaml.safe_load(f)
self.one_file_system = config.get('one-file-system', False)
self.exclude_caches = config.get('exclude-caches', False)
if 'max-file-size' in config:
self.max_file_size = humanfriendly.parse_size(
config['max-file-size'])
else:
self.max_file_size = None
raw = config.get('roots', '').encode().split(b'\n')
self.roots = []
for x in raw:
if not len(x):
continue
self.roots.append(x)
self.roots.sort(key=len)
def process_match_list(config_name):
raw = config.get(config_name, '').encode().split(b'\n')
# Helper to process lists of patterns into regexes
def process_match_list(config_entry):
raw = config_entry.encode().split(b'\n')
pats = []
# Prepend '**/' to any relative patterns
for x in raw:
@@ -70,39 +57,49 @@ class Config:
pats.append(x)
else:
pats.append(b'**/' + x)
return pats
self.exclude = process_match_list('exclude')
self.force_include = process_match_list('force-include')
# Compile patterns.
(a, b) = wcmatch.glob.translate(
pats, flags=(wcmatch.glob.GLOBSTAR |
wcmatch.glob.DOTGLOB |
wcmatch.glob.NODOTDIR |
wcmatch.glob.EXTGLOB |
wcmatch.glob.BRACE))
return ([ re.compile(x) for x in a ],
[ re.compile(x) for x in b ])
# Read config
with open(configfile, 'r') as f:
config = yaml.safe_load(f)
self.one_file_system = config.get('one-file-system', False)
self.exclude_caches = config.get('exclude-caches', False)
raw = config.get('roots', '').encode().split(b'\n')
self.roots = []
for x in raw:
if not len(x):
continue
self.roots.append(x)
self.roots.sort(key=len)
self.exclude = process_match_list(config.get('exclude', ''))
self.unexclude = process_match_list(config.get('unexclude', ''))
self.max_size_rules = []
rules = { humanfriendly.parse_size(k): v
for k, v in config.get('max-size-rules', {}).items() }
for size in reversed(sorted(rules)):
self.max_size_rules.append(
(size, process_match_list(rules[size])))
self.notify_email = config.get('notify-email', None)
# Compile patterns
flags = (wcmatch.glob.GLOBSTAR |
wcmatch.glob.DOTGLOB |
wcmatch.glob.NODOTDIR |
wcmatch.glob.EXTGLOB |
wcmatch.glob.BRACE)
# Path matches if it matches at least one regex in "a" and no
# regex in "b"
(a, b) = wcmatch.glob.translate(self.exclude, flags=flags)
self.exclude_re = ([ re.compile(x) for x in a ],
[ re.compile(x) for x in b ])
(a, b) = wcmatch.glob.translate(self.force_include, flags=flags)
self.force_include_re = ([ re.compile(x) for x in a ],
[ re.compile(x) for x in b ])
def match_re(self,
re: typing.Tuple[typing.List[typing.Pattern],
typing.List[typing.Pattern]],
path: bytes):
def match_re(self, r: MatchPatterns, path: bytes):
# Path matches if it matches at least one regex in
# re[0] and no regex in re[1].
for a in re[0]:
# r[0] and no regex in r[1].
for a in r[0]:
if a.match(path):
for b in re[1]:
for b in r[1]:
if b.match(path):
return False
return True
@@ -114,24 +111,31 @@ class Backup:
self.dry_run = dry_run
self.root_seen: typing.Dict[bytes, bool] = {}
# Saved log messages
# Saved log messages (which includes borg output)
self.logs: typing.List[typing.Tuple[str, str]] = []
def out(self, path: bytes):
self.outfile.write(path + (b'\n' if self.dry_run else b'\0'))
def log(self, letter: str, msg: str, bold: bool=False):
colors = { 'E': 31, 'W': 33, 'I': 36 };
colors = {
'E': 31, # red: error
'W': 33, # yellow: warning
'N': 34, # blue: notice, a weaker warning (no email generated)
'I': 36, # cyan: info, backup.py script output
'O': 37, # white: regular output from borg
};
c = colors[letter] if letter in colors else 0
b = "" if bold else "\033[22m"
sys.stderr.write(f"\033[1;{c}m{letter}:{b} {msg}\033[0m\n")
sys.stdout.write(f"\033[1;{c}m{letter}:{b} {msg}\033[0m\n")
sys.stdout.flush()
self.logs.append((letter, msg))
def run(self, outfile: typing.IO[bytes]):
self.outfile = outfile
for root in self.config.roots:
if root in self.root_seen:
self.log('I', f"ignoring root, already seen: {pstr(root)}")
self.log('I', f"ignoring root, already seen: {b2s(root)}")
continue
try:
@@ -139,13 +143,13 @@ class Backup:
if not stat.S_ISDIR(st.st_mode):
raise NotADirectoryError
except FileNotFoundError:
self.log('E', f"root does not exist: {pstr(root)}")
self.log('E', f"root does not exist: {b2s(root)}")
continue
except NotADirectoryError:
self.log('E', f"root is not a directory: {pstr(root)}")
self.log('E', f"root is not a directory: {b2s(root)}")
continue
self.log('I', f"processing root {pstr(root)}")
self.log('I', f"processing root {b2s(root)}")
self.scan(root)
def scan(self, path: bytes, parent_st: os.stat_result=None):
@@ -166,7 +170,7 @@ class Backup:
# See if there's a reason to exclude it
exclude_reason = None
if self.config.match_re(self.config.exclude_re, decorated_path):
if self.config.match_re(self.config.exclude, decorated_path):
# Config file says to exclude
exclude_reason = ('I', f"skipping, excluded by config file")
@@ -178,20 +182,27 @@ class Backup:
exclude_reason = ('I', "skipping, on different filesystem")
elif (is_reg
and self.config.max_file_size
and size > self.config.max_file_size):
# Too big
a = format_size(size)
b = format_size(self.config.max_file_size)
exclude_reason = ('W', f"file size {a} exceeds limit {b}")
and len(self.config.max_size_rules)
and size > self.config.max_size_rules[-1][0]):
# Check file sizes against our list.
# Only need to check if the size is bigger than the smallest
# entry on the list; then, we need to check it against all rules
# to see which one applies.
for (max_size, patterns) in self.config.max_size_rules:
if self.config.match_re(patterns, decorated_path):
if size > max_size:
a = format_size(size)
b = format_size(max_size)
exclude_reason = (
'W', f"file size {a} exceeds limit {b}")
break
# If we have a reason to exclude it, stop now unless it's
# force-included
force = self.config.match_re(self.config.force_include_re,
decorated_path)
force = self.config.match_re(self.config.unexclude, decorated_path)
if exclude_reason and not force:
self.log(exclude_reason[0],
f"{exclude_reason[1]}: {pstr(path)}")
f"{exclude_reason[1]}: {b2s(path)}")
return
# Print path for Borg
@@ -213,7 +224,7 @@ class Backup:
with open(path + b'/CACHEDIR.TAG', 'rb') as f:
if f.read(len(tag)) == tag:
self.log(
'I', f"skipping, cache dir: {pstr(path)}")
'I', f"skipping, cache dir: {b2s(path)}")
return
except:
pass
@@ -227,9 +238,125 @@ class Backup:
IsADirectoryError,
NotADirectoryError,
PermissionError) as e:
self.log('E', f"can't read {pstr(path)}: {str(e)}")
self.log('E', f"can't read {b2s(path)}: {str(e)}")
return
def run_borg(self, argv: typing.List[str],
stdin_writer: typing.Callable[[typing.IO[bytes]],
typing.Any]=None):
"""Run a borg command, capturing and displaying output, while feeding
input using stdin_writer. Returns True on Borg success, False on error.
"""
borg = subprocess.Popen(argv,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
if borg.stdin is None:
raise Exception("no pipe")
# Count warnings and errors from Borg, so we can interpret its
# error codes correctly (e.g. ignoring exit codes if warnings
# were all harmless).
borg_saw_warnings = 0
borg_saw_errors = 0
# Use a thread to capture output
def reader_thread(fh):
nonlocal borg_saw_warnings
nonlocal borg_saw_errors
last_progress = 0
for line in fh:
try:
data = json.loads(line)
if data['type'] == 'log_message':
changed_msg = "file changed while we backed it up"
if data['levelname'] == 'WARNING':
if changed_msg in data['message']:
# harmless; don't count as a Borg warning
outlevel = 'N'
else:
borg_saw_warnings += 1
outlevel = 'W'
output = "warning: "
elif data['levelname'] not in ('DEBUG', 'INFO'):
borg_saw_errors += 1
outlevel = 'E'
output = "error: "
else:
outlevel = 'O'
output = ""
output += data['message']
elif (data['type'] == 'progress_message'
and 'message' in data):
outlevel = 'O'
output = data['message']
elif data['type'] == 'archive_progress':
now = time.time()
if now - last_progress > 10:
last_progress = now
def size(short: str, full: str) -> str:
return f" {short}={format_size(data[full])}"
outlevel = 'O'
output = (f"progress:" +
f" files={data['nfiles']}" +
size('orig', 'original_size') +
size('comp', 'compressed_size') +
size('dedup', 'deduplicated_size'))
else:
continue
else:
# ignore unknown progress line
continue
except Exception as e:
# on error, print raw line with exception
outlevel = 'E'
output = f"[exception: {str(e)}] " + b2s(line).rstrip()
self.log(outlevel, output)
fh.close()
def _reader_thread(fh):
try:
return reader_thread(fh)
except BrokenPipeError:
pass
except Exception:
_thread.interrupt_main()
reader = threading.Thread(target=_reader_thread, args=(borg.stdout,))
reader.daemon = True
reader.start()
try:
if stdin_writer:
# Give borg some time to start, just to clean up stdout
time.sleep(1)
stdin_writer(borg.stdin)
except BrokenPipeError:
self.log('E', "<broken pipe>")
finally:
try:
borg.stdin.close()
except BrokenPipeError:
pass
borg.wait()
reader.join()
ret = borg.returncode
if ret < 0:
self.log('E', f"borg exited with signal {-ret}")
elif ret == 2 or borg_saw_errors:
self.log('E', f"borg exited with errors (ret={ret})")
elif ret == 1:
if borg_saw_warnings:
self.log('W', f"borg exited with warnings (ret={ret})")
else:
return True
elif ret != 0:
self.log('E', f"borg exited with unknown error code {ret}")
else:
return True
return False
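The exit-code handling at the end of run_borg can be summarized as a small pure function (a sketch; the function name is illustrative, not part of the script):

```python
def classify_borg_exit(ret: int, saw_warnings: int, saw_errors: int) -> str:
    """Mirror run_borg's reading of Borg exit codes:
    0 = success, 1 = warnings, 2 = errors, < 0 = killed by signal.
    A warning exit still counts as success if every warning was one we
    deliberately ignored (e.g. "file changed while we backed it up")."""
    if ret < 0:
        return 'error'          # terminated by signal -ret
    if ret == 2 or saw_errors:
        return 'error'
    if ret == 1:
        return 'warning' if saw_warnings else 'success'
    if ret != 0:
        return 'error'          # unknown exit code
    return 'success'
```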
def main(argv: typing.List[str]):
import argparse
@@ -278,8 +405,6 @@ def main(argv: typing.List[str]):
backup.log('W', f"failed to parse variables from {args.vars}: {str(e)}")
# Run backup
captured_output: typing.List[bytes] = []
if args.dry_run:
if args.debug:
backup.run(sys.stdout.buffer)
@@ -288,112 +413,37 @@ def main(argv: typing.List[str]):
backup.run(out)
sys.stdout.flush()
else:
borg = subprocess.Popen([borg_sh,
"create",
"--verbose",
"--progress",
"--log-json",
"--list",
"--filter", "E",
"--stats",
"--checkpoint-interval", "900",
"--compression", "zstd,3",
"--paths-from-stdin",
"--paths-delimiter", "\\0",
"::" + hostname + "-{now:%Y%m%d-%H%M%S}"],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
if borg.stdin is None:
raise Exception("no pipe")
if backup.run_borg([borg_sh,
"create",
"--verbose",
"--progress",
"--log-json",
"--list",
"--filter", "E",
"--stats",
"--checkpoint-interval", "900",
"--compression", "zstd,3",
"--paths-from-stdin",
"--paths-delimiter", "\\0",
"::" + hostname + "-{now:%Y%m%d-%H%M%S}"],
stdin_writer=backup.run):
borg_saw_warnings = 0
borg_saw_errors = 0
# Use a thread to capture output
def reader_thread(fh):
nonlocal borg_saw_warnings
nonlocal borg_saw_errors
last_progress = 0
for line in fh:
try:
data = json.loads(line)
if data['type'] == 'log_message':
# Count warnings and errors, but ignore some.
changed_msg = "file changed while we backed it up"
if data['levelname'] == 'WARNING':
prefix = "warning: "
if changed_msg not in data['message']:
borg_saw_warnings += 1
elif data['levelname'] not in ('DEBUG', 'INFO'):
prefix = "error: "
borg_saw_errors += 1
else:
prefix = ""
line = (prefix + data['message'] + '\n').encode()
elif (data['type'] == 'progress_message'
and 'message' in data):
line = (data['message'] + '\n').encode()
elif data['type'] == 'archive_progress':
now = time.time()
if now - last_progress > 10:
last_progress = now
def size(short: str, full: str) -> str:
return f" {short}={format_size(data[full])}"
line = (f"progress:" +
f" files={data['nfiles']}" +
size('orig', 'original_size') +
size('comp', 'compressed_size') +
size('dedup', 'deduplicated_size') +
"\n").encode()
else:
continue
else:
# ignore unknown progress line
continue
except Exception as e:
# on error, print raw line with exception
line = f"[exception: {str(e)} ]".encode() + line
sys.stdout.buffer.write(line)
sys.stdout.flush()
captured_output.append(line)
fh.close()
def _reader_thread(fh):
try:
return reader_thread(fh)
except BrokenPipeError:
pass
except Exception:
_thread.interrupt_main()
reader = threading.Thread(target=_reader_thread, args=(borg.stdout,))
reader.daemon = True
reader.start()
try:
# Give borg some time to start, just to clean up stdout
time.sleep(1)
backup.run(borg.stdin)
except BrokenPipeError:
sys.stderr.write("broken pipe\n")
finally:
try:
borg.stdin.close()
except BrokenPipeError:
pass
borg.wait()
reader.join()
ret = borg.returncode
if ret < 0:
backup.log('E', f"borg exited with signal {-ret}")
elif ret == 2 or borg_saw_errors:
backup.log('E', f"borg exited with errors (ret={ret})")
elif ret == 1 and borg_saw_warnings:
backup.log('W', f"borg exited with warnings (ret={ret})")
elif ret != 0:
backup.log('E', f"borg exited with unknown error code {ret}")
# backup success; run prune. Note that this won't actually free
# space until a "./borg.sh --rw compact", because we're in
# append-only mode.
backup.log('I', f"pruning archives", bold=True)
backup.run_borg([borg_sh,
"prune",
"--verbose",
"--list",
"--progress",
"--log-json",
"--stats",
"--keep-within=7d",
"--keep-daily=14",
"--keep-weekly=8",
"--keep-monthly=-1",
"--glob-archives", hostname + "-????????-??????"])
# See if we had any errors
warnings = sum(1 for (letter, msg) in backup.logs if letter == 'W')
@@ -420,19 +470,24 @@ def main(argv: typing.List[str]):
if email and not args.dry_run:
backup.log('I', f"sending error notification to {email}")
# Show all of our warnings and errors. Use a ">" prefix
# so warnings and errors get highlighted by the mail reader.
body = [ "Logs from backup.py:" ]
for (letter, msg) in backup.logs:
if letter == "E" or letter == "W":
prefix = ">"
else:
prefix = " "
body.append(f"{prefix}{letter}: {msg}")
body_text = "\n".join(body).encode()
def write_logs(title, only_include=None):
body = [ title ]
for (letter, msg) in backup.logs:
if only_include and letter not in only_include:
continue
# Use a ":" prefix for warnings/errors/notices so that
# the mail reader highlights them.
if letter in "EWN":
prefix = ":"
else:
prefix = " "
body.append(f"{prefix}{letter}: {msg}")
return "\n".join(body).encode()
# Followed by borg output
body_text += b"\n\nBorg output:\n" + b"".join(captured_output)
body_text = write_logs("Logged errors and warnings:", "EWN")
body_text += b"\n\n"
body_text += write_logs("All log messages:")
# Subject summary
if errmsg and warnmsg:

bin/borg.armv7l (new executable; binary file not shown)
(second binary file not shown)

borg.sh

@@ -7,15 +7,6 @@ export BORG_PASSCOMMAND="cat ${BORG_DIR}/passphrase"
export BORG_BASE_DIR=${BORG_DIR}
export BORG_CACHE_DIR=${BORG_DIR}/cache
export BORG_CONFIG_DIR=${BORG_DIR}/config
if [ "$1" = "--rw" ] ; then
if [ "$BORG_RW_KEY_ADDED" != "1" ] ; then
echo "=== Need SSH key passphrase. Check Bitwarden for:"
echo "=== borg $HOSTNAME / read-write SSH key"
fi
export BORG_RSH="ssh -F $SSH/config -o BatchMode=no -o PreferredAuthentications=publickey -i $SSH/id_ecdsa"
shift
else
export BORG_RSH="ssh -F $SSH/config -i $SSH/id_ecdsa_appendonly"
fi
export BORG_RSH="ssh -F $SSH/config -i $SSH/id_ecdsa_appendonly"
exec "${BORG_BIN}" "$@"


@@ -10,12 +10,6 @@ roots: |
one-file-system: true
exclude-caches: true
# Files larger than this are excluded. If a large file isn't
# explicitly mentioned in "excludes" below, it also generates a
# warning. Note that this counts used blocks, so files with large
# holes will still be considered small (since they'll compress easily)
max-file-size: 500MiB
# Files/dirs to exclude from backup.
# Relative paths are treated as if starting with **/
# Paths ending in / will only match directories.
@@ -27,10 +21,21 @@ exclude: |
Steam/ubuntu*/
.cache/
# Rules to exclude files based on file size.
# This is a dict of sizes, each with a list of rules.
# For a given path, the largest size with a matching rule applies.
# Matching follows the same behavior as the "exclude" list.
# Size is calculated as used blocks (think "du", not "du --apparent-size").
max-size-rules:
500 MiB: |
*
# 1.0 GiB: |
# *.mp4
# Files that are always included, even if they would have been
# excluded due to file size or the "exclude" list.
# Matching rules are the same as above.
force-include: |
# Matching follows the same behavior as the "exclude" list.
unexclude: |
.git/objects/pack/*.pack
# Email address for notification at end of backup
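The size-rule lookup described in the comments above (the largest matching size wins; matching behaves like the exclude list) could be approximated in Python as follows. This is an illustrative sketch using `fnmatch`-style globbing, with a hypothetical `max_allowed_size` helper; it is not the actual backup.py implementation:

```python
import fnmatch

def max_allowed_size(path, rules):
    """Hypothetical sketch: rules maps a size limit (bytes) to a list
    of glob patterns; the largest limit with a matching pattern wins,
    per the config.yaml comments. Not the real backup.py logic."""
    best = None
    for limit in sorted(rules):          # ascending, so the largest match wins
        for pattern in rules[limit]:
            # Relative patterns act as if prefixed with **/ --
            # approximate that by also matching the basename.
            if (fnmatch.fnmatch(path, pattern)
                    or fnmatch.fnmatch(path.rsplit("/", 1)[-1], pattern)):
                best = limit
    return best

rules = {500 * 2**20: ["*"], 2**30: ["*.mp4"]}
max_allowed_size("home/user/video.mp4", rules)   # the 1 GiB rule applies
max_allowed_size("home/user/notes.txt", rules)   # falls back to 500 MiB
```

Files with no matching rule would get no size limit at all, which matches the commented-out `1.0 GiB` example: uncommenting it raises the cap for `*.mp4` files only.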


@@ -1,21 +1,40 @@
#!/bin/bash
# These can be overridden when running this script
HOSTNAME=${HOSTNAME:-$(hostname)}
BACKUP_HOST=${BACKUP_HOST:-backup.jim.sh}
BACKUP_USER=${BACKUP_USER:-jim-backups}
BACKUP_REPO=${BACKUP_REPO:-borg/${HOSTNAME}}
set_default_variables()
{
HOSTNAME=${HOSTNAME:-$(hostname)}
BACKUP_HOST=${BACKUP_HOST:-backup.jim.sh}
BACKUP_PORT=${BACKUP_PORT:-222}
BACKUP_USER=${BACKUP_USER:-jim-backups}
BACKUP_REPO=${BACKUP_REPO:-borg/${HOSTNAME}}
SYSTEMD_UNIT=${SYSTEMD_UNIT:-borg-backup}
# Use stable host ID in case MAC address changes.
# Note that this host ID is only used to manage locks, so it's
# not crucial that it remains stable.
UUID=$(python3 -c 'import uuid;print(uuid.getnode())')
HOSTID=${BORG_HOST_ID:-"${HOSTNAME}@${UUID}"}
log "Configuration:"
log " HOSTNAME Local hostname: \033[1m${HOSTNAME}"
log " HOSTID Local host ID: \033[1m${HOSTID}"
log " BACKUP_USER Backup server user: \033[1m${BACKUP_USER}"
log " BACKUP_HOST Backup server host: \033[1m${BACKUP_HOST}"
log " BACKUP_PORT Backup server port: \033[1m${BACKUP_PORT}"
log " BACKUP_REPO Server repository: \033[1m${BACKUP_REPO}"
log " SYSTEMD_UNIT Systemd unit name: \033[1m${SYSTEMD_UNIT}"
for i in $(seq 15 -1 1); do
printf "\rPress ENTER or wait $i seconds to continue... \b"
if read -t 1 ; then break; fi
done
}
# Main dir is where this repo was checked out
BORG_DIR="$(realpath "$(dirname "$0")")"
cd "${BORG_DIR}"
# This is named with uppercase so that it doesn't tab-complete for
# "./b<tab>", which should give us "./borg.sh"
BORG_BIN="${BORG_DIR}/Borg.bin"
# Use stable host ID in case MAC address changes
HOSTID="${HOSTNAME}@$(python -c 'import uuid;print(uuid.getnode())')"
BORG_BIN="${BORG_DIR}/bin/borg.$(uname -m)"
function error_handler() {
echo "Error at $1 line $2:"
@@ -27,11 +46,100 @@ trap 'error_handler ${BASH_SOURCE} ${LINENO} $?' ERR
set -o errexit
set -o errtrace
if [ -e ".setup-complete" ]; then
echo "Error: BORG_DIR $BORG_DIR was already set up; giving up."
echo "Use \"git clean\" to return it to original state if desired"
exit 1
fi
# Create pip environment
setup_venv()
{
if ! which pipenv >/dev/null 2>&1 ; then
error "pipenv not found, try: sudo apt install pipenv"
fi
mkdir -p .venv
pipenv install
}
# Create shell script with environment variables
create_borg_vars()
{
VARS=${BORG_DIR}/vars.sh
# These variables are used elsewhere in this script
BORG_REPO="ssh://${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}/./${BACKUP_REPO}"
BORG=${BORG_DIR}/borg.sh
SSH=$BORG_DIR/ssh
cat >"$VARS" <<EOF
export BACKUP_USER=${BACKUP_USER}
export BACKUP_HOST=${BACKUP_HOST}
export BACKUP_PORT=${BACKUP_PORT}
export BACKUP_REPO=${BACKUP_REPO}
export HOSTNAME=${HOSTNAME}
export BORG_REPO=${BORG_REPO}
export BORG_HOST_ID=${HOSTID}
export BORG_PASSCOMMAND="cat ${BORG_DIR}/passphrase"
export BORG_DIR=${BORG_DIR}
export SSH=${SSH}
export BORG=${BORG}
export BORG_BIN=${BORG_BIN}
export SYSTEMD_UNIT=${SYSTEMD_UNIT}
EOF
if ! "$BORG" -h >/dev/null ; then
error "Can't run the borg wrapper; does borg work?"
fi
}
# Copy templated files, filling in templates as needed
install_templated_files()
{
DOCS="README.md"
SCRIPTS="notify.sh logs.sh"
for i in ${DOCS} ${SCRIPTS}; do
sed -e "s!\${HOSTNAME}!${HOSTNAME}!g" \
-e "s!\${BORG_DIR}!${BORG_DIR}!g" \
-e "s!\${BORG_BIN}!${BORG_BIN}!g" \
-e "s!\${BACKUP_USER}!${BACKUP_USER}!g" \
-e "s!\${BACKUP_HOST}!${BACKUP_HOST}!g" \
-e "s!\${BACKUP_PORT}!${BACKUP_PORT}!g" \
-e "s!\${BACKUP_REPO}!${BACKUP_REPO}!g" \
-e "s!\${SYSTEMD_UNIT}!${SYSTEMD_UNIT}!g" \
templates/$i > $i
done
chmod +x ${SCRIPTS}
}
# Update local paths in scripts
update_paths()
{
sed -i\
-e "1c#!${BORG_DIR}/.venv/bin/python" \
backup.py
}
# See if we're just supposed to update an existing install, or recovering
parse_args()
{
RECOVER=0
UPDATE=0
if [ "$1" == "--recover" ] ; then
if [ -e "vars.sh" ]; then
error "It looks like this borg dir was already set up; can only recover from a fresh start"
fi
RECOVER=1
elif [ "$1" == "--update-paths" ] || [ "$1" == "--update" ] ; then
if [ ! -e "vars.sh" ]; then
error "Can't update, not set up yet"
fi
UPDATE=1
elif [ -n "$1" ] ; then
error "Unknown arg $1"
elif [ -e "vars.sh" ]; then
warn "Error: BORG_DIR $BORG_DIR already looks set up."
warn "Use \"git clean\" to return it to original state if desired."
warn "Or specify --update to refresh things from latest git."
error "Giving up"
fi
}
# Make a temp dir to work in
TMP=$(mktemp -d)
@@ -66,46 +174,6 @@ notice() { msg 32 "$@" ; }
warn() { msg 31 "$@" ; }
error() { msg 31 "Error:" "$@" ; exit 1 ; }
# Create pip environment
setup_venv()
{
if ! which pipenv >/dev/null 2>&1 ; then
echo "pipenv not found, try: sudo apt install pipenv"
exit 1
fi
mkdir .venv
pipenv install
}
# Create shell script with environment variables
create_borg_vars()
{
VARS=${BORG_DIR}/vars.sh
# These variables are used elsewhere in this script
BORG_REPO="ssh://${BACKUP_USER}@${BACKUP_HOST}/./${BACKUP_REPO}"
BORG=${BORG_DIR}/borg.sh
SSH=$BORG_DIR/ssh
cat >"$VARS" <<EOF
export BACKUP_USER=${BACKUP_USER}
export BACKUP_HOST=${BACKUP_HOST}
export BACKUP_REPO=${BACKUP_REPO}
export HOSTNAME=${HOSTNAME}
export BORG_REPO=${BORG_REPO}
export BORG_HOST_ID=${HOSTID}
export BORG_PASSCOMMAND="cat ${BORG_DIR}/passphrase"
export BORG_DIR=${BORG_DIR}
export SSH=${SSH}
export BORG=${BORG}
export BORG_BIN=${BORG_BIN}
EOF
if ! "$BORG" -h >/dev/null ; then
error "Can't run the borg wrapper; does borg work?"
fi
}
print_random_key()
{
dd if=/dev/urandom | tr -dc 'a-zA-Z0-9' | head -c 16
@@ -113,8 +181,18 @@ print_random_key()
generate_keys()
{
PASS_SSH=$(print_random_key)
PASS_REPOKEY=$(print_random_key)
if [ $RECOVER -eq 1 ] ; then
notice "Recovering configuration in order to use an existing backup"
read -s -p "Repo key for \"borg ${HOSTNAME}\": " PASS_REPOKEY
echo
read -s -p "Again: " PASS_REPOKEY2
echo
if [ -z "$PASS_REPOKEY" ] || [ "$PASS_REPOKEY" != "$PASS_REPOKEY2" ] ; then
error "Bad repo key"
fi
else
PASS_REPOKEY=$(print_random_key)
fi
echo "$PASS_REPOKEY" > passphrase
chmod 600 passphrase
}
@@ -136,8 +214,6 @@ configure_ssh()
-C "backup-appendonly@$HOSTID" -f "$SSH/id_ecdsa_appendonly"
ssh-keygen -N "" -t ecdsa \
-C "backup-notify@$HOSTID" -f "$SSH/id_ecdsa_notify"
ssh-keygen -N "$PASS_SSH" -t ecdsa \
-C "backup@$HOSTID" -f "$SSH/id_ecdsa"
# Create config snippets
log "Creating SSH config and wrapper script"
@@ -159,15 +235,18 @@ EOF
ssh -F "$SSH/config" -o BatchMode=no -o PubkeyAuthentication=no \
-o ControlMaster=yes -o ControlPath="$TMP/ssh-control" \
-o StrictHostKeyChecking=accept-new \
-p "${BACKUP_PORT}" \
-f "${BACKUP_USER}@${BACKUP_HOST}" sleep 600
if ! run_ssh_command true >/dev/null 2>&1 </dev/null ; then
error "SSH failed"
fi
log "Connected to ${BACKUP_USER}@${BACKUP_HOST}"
log "Connected to ${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}"
# Since we now have an SSH connection, check that the repo doesn't exist
if run_ssh_command "test -e $BACKUP_REPO" ; then
# Since we now have an SSH connection, check repo existence
if [ $RECOVER -eq 0 ] && run_ssh_command "test -e $BACKUP_REPO"; then
error "$BACKUP_REPO already exists on the server, bailing out"
elif [ $RECOVER -ne 0 ] && ! run_ssh_command "test -e $BACKUP_REPO"; then
error "$BACKUP_REPO does NOT exist on the server, can't recover backup config"
fi
# Copy SSH keys to the server's authorized_keys file, removing any
@@ -184,13 +263,13 @@ EOF
run_ssh_command "cat >> .ssh/authorized_keys" <<EOF
command="$cmd --append-only",restrict $(cat "$SSH/id_ecdsa_appendonly.pub")
command="borg/notify.sh",restrict $(cat "$SSH/id_ecdsa_notify.pub")
command="$cmd",restrict $(cat "$SSH/id_ecdsa.pub")
EOF
# Test that everything worked
log "Testing SSH login with new key"
if ! ssh -F "$SSH/config" -i "$SSH/id_ecdsa_appendonly" -T \
"${BACKUP_USER}@${BACKUP_HOST}" "$REMOTE_BORG" --version </dev/null ; then
-p "${BACKUP_PORT}" "${BACKUP_USER}@${BACKUP_HOST}" "$REMOTE_BORG" \
--version </dev/null ; then
error "Logging in with a key failed -- is server set up correctly?"
fi
log "Remote connection OK!"
@@ -219,8 +298,8 @@ EOF
configure_systemd()
{
TIMER=borg-backup.timer
SERVICE=borg-backup.service
TIMER=${SYSTEMD_UNIT}.timer
SERVICE=${SYSTEMD_UNIT}.service
TIMER_UNIT=${BORG_DIR}/${TIMER}
SERVICE_UNIT=${BORG_DIR}/${SERVICE}
@@ -256,89 +335,117 @@ ExecStart=${BORG_DIR}/backup.py
Nice=10
IOSchedulingClass=best-effort
IOSchedulingPriority=6
Restart=on-failure
RestartSec=600
EOF
log "Setting up systemd"
if (
ln -sfv "${TIMER_UNIT}" /etc/systemd/system &&
ln -sfv "${SERVICE_UNIT}" /etc/systemd/system &&
systemctl --no-ask-password daemon-reload &&
systemctl --no-ask-password enable ${TIMER} &&
systemctl --no-ask-password start ${TIMER}
); then
log "Backup timer installed:"
systemctl list-timers --no-pager ${TIMER}
else
warn ""
warn "Systemd setup failed"
warn "Do something like this to configure automatic backups:"
echo " sudo ln -sfv \"${TIMER_UNIT}\" /etc/systemd/system &&"
echo " sudo ln -sfv \"${SERVICE_UNIT}\" /etc/systemd/system &&"
echo " sudo systemctl daemon-reload &&"
echo " sudo systemctl enable ${TIMER} &&"
if [ $RECOVER -eq 1 ] ; then
log "Partially setting up systemd"
ln -sfv "${TIMER_UNIT}" /etc/systemd/system
ln -sfv "${SERVICE_UNIT}" /etc/systemd/system
systemctl --no-ask-password daemon-reload
systemctl --no-ask-password stop ${TIMER}
systemctl --no-ask-password disable ${TIMER}
warn "Since we're recovering, systemd automatic backups aren't enabled"
warn "Do something like this to configure automatic backups:"
echo " sudo systemctl enable ${TIMER} &&"
echo " sudo systemctl start ${TIMER}"
warn ""
else
log "Setting up systemd"
if (
ln -sfv "${TIMER_UNIT}" /etc/systemd/system &&
ln -sfv "${SERVICE_UNIT}" /etc/systemd/system &&
systemctl --no-ask-password daemon-reload &&
systemctl --no-ask-password enable ${TIMER} &&
systemctl --no-ask-password start ${TIMER}
); then
log "Backup timer installed:"
systemctl list-timers --no-pager ${TIMER}
else
warn ""
warn "Systemd setup failed"
warn "Do something like this to configure automatic backups:"
echo " sudo ln -sfv \"${TIMER_UNIT}\" /etc/systemd/system &&"
echo " sudo ln -sfv \"${SERVICE_UNIT}\" /etc/systemd/system &&"
echo " sudo systemctl daemon-reload &&"
echo " sudo systemctl enable ${TIMER} &&"
echo " sudo systemctl start ${TIMER}"
warn ""
fi
fi
}
update_paths()
{
sed -i \
-e "s!\${HOSTNAME}!${HOSTNAME}!g" \
-e "s!\${BORG_DIR}!${BORG_DIR}!g" \
-e "s!\${BACKUP_USER}!${BACKUP_USER}!g" \
-e "s!\${BACKUP_HOST}!${BACKUP_HOST}!g" \
-e "s!\${BACKUP_REPO}!${BACKUP_REPO}!g" \
README.md
sed -i\
-e "1c#!${BORG_DIR}/.venv/bin/python" \
backup.py
}
git_setup()
{
if ! git checkout -b "setup-${HOSTNAME}" ; then
log "Committing local changes to git"
if ! git checkout -b "setup-${HOSTNAME}" ||
! git add ${SYSTEMD_UNIT}.service ${SYSTEMD_UNIT}.timer vars.sh ||
! git commit -a -m "autocommit after initial setup on ${HOSTNAME}" ; then
warn "Git setup failed; ignoring"
return
fi
log "Committing local changes to git"
git add README.md borg-backup.service borg-backup.timer vars.sh
git commit -a -m "autocommit after initial setup on ${HOSTNAME}"
}
log "Configuration:"
log " Backup server host: ${BACKUP_HOST}"
log " Backup server user: ${BACKUP_USER}"
log " Repository path: ${BACKUP_REPO}"
main() {
if [ $UPDATE -eq 1 ] ; then
notice "Non-destructively updating paths, variables, and venv..."
source vars.sh
set_default_variables
install_templated_files
update_paths
setup_venv
create_borg_vars
setup_venv
create_borg_vars
generate_keys
configure_ssh
create_repo
export_keys
configure_systemd
update_paths
git_setup
notice "Testing SSH"
ssh -F "$SSH/config" -i "$SSH/id_ecdsa_appendonly" \
-o BatchMode=no -o StrictHostKeyChecking=ask \
-p "${BACKUP_PORT}" "${BACKUP_USER}@${BACKUP_HOST}" info >/dev/null
echo
notice "Add these two passwords to Bitwarden:"
notice ""
notice " Name: borg ${HOSTNAME}"
notice " Username: read-write ssh key"
notice " Password: $PASS_SSH"
notice ""
notice " Name: borg ${HOSTNAME}"
notice " Username: repo key"
notice " Password: $PASS_REPOKEY"
notice ""
notice "Test the backup file list with"
notice " sudo ${BORG_DIR}/backup.py --dry-run"
notice "and make any necessary adjustments to:"
notice " ${BORG_DIR}/config.yaml"
notice "Testing borg: if host location changed, say 'y' here"
${BORG_DIR}/borg.sh info
echo
notice "Done -- check 'git diff' and verify changes."
exit 0
fi
echo "All done"
set_default_variables
setup_venv
create_borg_vars
generate_keys
configure_ssh
[ $RECOVER -eq 0 ] && create_repo
export_keys
configure_systemd
install_templated_files
update_paths
git_setup
echo
if [ $RECOVER -eq 1 ] ; then
notice "You should be set up with borg pointing to the existing repo now."
notice "Use commands like these to look at the backup:"
notice " sudo ${BORG_DIR}/borg.sh info"
notice " sudo ${BORG_DIR}/borg.sh list"
notice "You'll now want to restore files such as ${BORG_DIR}/config.yaml before enabling the systemd timer"
else
notice "Add this password to Bitwarden:"
notice ""
notice " Name: borg ${HOSTNAME}"
notice " Username: repo key"
notice " Password: $PASS_REPOKEY"
notice ""
notice "Test the backup file list with"
notice " sudo ${BORG_DIR}/backup.py --dry-run"
notice "and make any necessary adjustments to:"
notice " ${BORG_DIR}/config.yaml"
fi
echo
echo "All done"
}
parse_args "$@"
main


@@ -1,26 +0,0 @@
#!/bin/bash
set -e
. "$(dirname "$0")"/vars.sh
if [ "$BORG_RW_KEY_ADDED" != "1" ] ; then
echo "Re-executing under a new ssh agent"
exec env BORG_RW_KEY_ADDED=1 ssh-agent "$0"
fi
echo "=== Please enter SSH key passphrase. Check Bitwarden for:"
echo "=== borg $HOSTNAME / read-write SSH key"
ssh-add "$(realpath "$(dirname "$0")")/ssh/id_ecdsa"
$BORG --rw prune \
--verbose \
--progress \
--stats \
--keep-within=7d \
--keep-daily=14 \
--keep-weekly=8 \
--keep-monthly=-1
$BORG --rw compact \
--verbose \
--progress

templates/README.md Normal file

@@ -0,0 +1,145 @@
Initial setup
=============
Run on client:
sudo git clone https://git.jim.sh/jim/borg-setup.git /opt/borg
sudo /opt/borg/initial-setup.sh
Customize `/opt/borg/config.yaml` as desired.
Cheat sheet
===========
*After setup, /opt/borg/README.md will have the variables in this
section filled in automatically*
## Configuration
Hostname: ${HOSTNAME}
Base directory: ${BORG_DIR}
Destination: ${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}
Repository: ${BACKUP_REPO}
## Commands
See when next backup is scheduled:
systemctl list-timers ${SYSTEMD_UNIT}.timer
See log of most recent backup attempt:
${BORG_DIR}/logs.sh
${BORG_DIR}/logs.sh -f
Start backup now:
sudo systemctl restart ${SYSTEMD_UNIT}
Interrupt backup in progress:
sudo systemctl stop ${SYSTEMD_UNIT}
Show backups and related info:
sudo ${BORG_DIR}/borg.sh info
sudo ${BORG_DIR}/borg.sh list
Run Borg using the read-write SSH key:
sudo ${BORG_DIR}/borg.sh --rw list
Mount and look at files:
mkdir mnt
sudo ${BORG_DIR}/borg.sh mount :: mnt
sudo -s # to explore as root
sudo umount mnt
Update borg-setup from git:
cd /opt/borg
sudo git remote update
sudo git rebase origin/master
sudo ./initial-setup.sh --update
sudo git commit -m 'Update borg-setup'
## Compaction and remote access
Old backups are "pruned" automatically, but because the SSH key is
append-only, no space is actually recovered on the server; the data
is only marked for deletion. If you are sure that the client system
was not compromised, you can reclaim the space by logging in to the
backup host via SSH (bitwarden `ssh ${BACKUP_HOST} / ${BACKUP_USER}`)
and running compaction there:
ssh -p ${BACKUP_PORT} ${BACKUP_USER}@${BACKUP_HOST} borg/borg compact --verbose --progress ${BACKUP_REPO}
This doesn't require the repo key. That key shouldn't be entered on
the untrusted backup host, so for operations that need it, use a
trusted host and run borg remotely instead, e.g.:
${BORG_BIN} --remote-path borg/borg info ssh://${BACKUP_USER}@${BACKUP_HOST}:${BACKUP_PORT}/borg/${HOSTNAME}
The repo passphrase is in bitwarden `borg ${HOSTNAME} / repo key`.
Design
======
- On server, we have a separate user account "jim-backups". Password
for this account is in bitwarden in the "Backups" folder, under `ssh
backup.jim.sh`.
- Repository keys are repokeys, which get stored on the server, inside
the repo. Passphrases are stored:
- on clients (in `/opt/borg/passphrase`, for making backups)
- in bitwarden (under `borg <hostname>`, user `repo key`)
- Each client has two passwordless SSH keys for connecting to the server:
- `/opt/borg/ssh/id_ecdsa_appendonly`
- configured on server for append-only operation
- used for making backups
- `/opt/borg/ssh/id_ecdsa_notify`
- configured on server for running `borg/notify.sh` only
- used for sending email notifications on errors
- Systemd timers start daily backups:
/etc/systemd/system/borg-backup.service -> /opt/borg/borg-backup.service
/etc/systemd/system/borg-backup.timer -> /opt/borg/borg-backup.timer
- Backup script `/opt/borg/backup.py` uses configuration in
`/opt/borg/config.yaml` to generate our own list of files, excluding
anything that's too large by default. This requires borg 1.2 or newer.
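The interplay between a generated file list and borg's `--paths-from-stdin`/`--paths-delimiter` flags (seen in the backup.py diff above) might be sketched like this. The directory walk, the used-blocks size check, and the helper names here are assumptions for illustration only, not the actual script:

```python
import os

def candidate_paths(root, max_size):
    """Illustrative sketch: yield files under root that fit the size
    limit, counting used blocks (like "du"), so sparse files with
    large holes still count as small. Not the real backup.py."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
            except OSError:
                continue          # file vanished mid-walk; skip it
            if st.st_blocks * 512 <= max_size:
                yield path

def write_paths(fh, root, max_size):
    # The list is fed NUL-separated to borg's stdin, matching
    # "borg create --paths-from-stdin --paths-delimiter \0".
    for path in candidate_paths(root, max_size):
        fh.write(path.encode("utf-8", "surrogateescape") + b"\0")
```

NUL as the delimiter is the safe choice here, since file names may legally contain newlines but never NUL bytes.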
Notes
=====
# Building Borg binary from git
sudo apt install python3.9 scons libacl1-dev libfuse-dev libpython3.9-dev patchelf
git clone https://github.com/borgbackup/borg.git
cd borg
virtualenv --python=python3.9 borg-env
source borg-env/bin/activate
pip install -r requirements.d/development.txt
pip install pyinstaller
pip install llfuse
pip install -e .[llfuse]
pyinstaller --clean --noconfirm scripts/borg.exe.spec
pip install staticx
# for x86
staticx -l /lib/x86_64-linux-gnu/libm.so.6 dist/borg.exe borg.x86_64
# for ARM; see https://github.com/JonathonReinhart/staticx/issues/209
staticx -l /lib/arm-linux-gnueabihf/libm.so.6 dist/borg.exe borg.armv7l
Then use the resulting binary for your architecture, confirming it works with `./borg.x86_64 --version` or `./borg.armv7l --version`.
*Note:* This uses the deprecated `llfuse` instead of the newer `pyfuse3`.
`pyfuse3` doesn't work because, at minimum, it pulls in `trio` which
requires `ssl` which is explicitly excluded by `scripts/borg.exe.spec`.

templates/logs.sh Normal file

@@ -0,0 +1,3 @@
#!/bin/bash
exec journalctl --all --unit ${SYSTEMD_UNIT} --since "$(systemctl show -P ExecMainStartTimestamp ${SYSTEMD_UNIT})" "$@"

notify.sh → templates/notify.sh Executable file → Normal file

@@ -22,5 +22,6 @@ EMAIL="$2"
ssh \
-F "$SSH/config" \
-i "$SSH/id_ecdsa_notify" \
-p "$BACKUP_PORT" \
"$BACKUP_USER@$BACKUP_HOST" \
borg/notify.sh "$EMAIL"