365 Days of Code - Day 046

Project Status

| Project | Language | Status | Due Date | Latest Update |
| --- | --- | --- | --- | --- |
| Personal Website | Hugo | Ongoing | None | The site is live. Some TODOs remain: categorization, tagging, and layout improvements. |
| Laravel From Scratch | Laravel (PHP) | In-Progress | 2026-03-31 | Episode 8 |
| PRM | Laravel (PHP) | In-Progress | 2026-03-31 | Working alongside other Laravel projects. |
| Client Website (J.L.) | Laravel (PHP) | In-Progress | 2026-03-31 | Working alongside other Laravel projects. |
| Project Euler | C | Ongoing | None | Working on P25. The AI-generated BigInt was a waste of time; need to rewrite it. |
| Practice Java | Java | Paused | None | Installed; need to find a good project. |
| Practice Python | Python | Paused | None | Installed; need to find a good project. |
| Learn Go | Go | Paused | None | Installed; work on LDAP Injector from ippsec. |
| Learn Rust | Rust | Haven’t Started | None | Installed; will try network protocols after finishing in C and Zig. |
| Learn Elixir | Elixir | Haven’t Started | None | Installed; need a good tutorial project. |
| Learn Haskell | Haskell | Haven’t Started | None | Installed; need a good tutorial project. |
| Learn Zig | Zig | Haven’t Started | None | Installed; will try network protocols after finishing in C. |
| Linux+ | N/A | In-Progress | 2026-03-31 | Reading Chapter 4. |
| Cyber Quest 2026 | N/A | In-Progress | 2026-02-28 | Finished quiz 1 with 75%. |
| Operating Systems | N/A | In-Progress | 2026-03-31 | Reading Chapter 4: Abstraction. |
| Grey-Hat Hacking | Various | In-Progress | 2026-03-31 | Reading Chapter 8: Threat Hunting Lab. |
| PHP Time Tracker | PHP | Beta Finished | None | Working at a basic level. Could use a couple more updates to be fully functional. |
| HTTP Status Code Reader | C | Complete | 2026-02-18 | Complete. Could be upgraded with more advanced functions or redirect following. |
| ZSH Configuration | bash/zsh | Complete | None | An ongoing process, but complete for now. Works well. |
| Network Protocols | C | In-Progress | None | V2 complete. Moving to V3, refactoring again. |
| Discinox Website | HTML, CSS, JS | Complete | 2026-03-04 | The site is live. |
| DiroffTech Website | HTML, CSS, JS | Complete | 2026-03-05 | The site is live. git-lfs needs to be initialized for images. |
| Automate Backups | bash | In-Progress | 2026-03-08 | Source backups done. Need to poll and copy backups to the archive server. |

Backup Day

Today is backup day. It has been several weeks since I last updated my Gitea server, and I have yet to take a webserver backup since migrating to my new VPS. Given the lessons I (hopefully) learned from my AWS spot-instance issue, these backups are long overdue.

Fortunately, this is a simple matter.

```bash
tar -czf backup_date_time.tar.gz /home/user/backups
```

Then scp these files to my local archive server:

```bash
scp user@webserver:/home/user/backups/backup_date_time.tar.gz .
```
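Since one of the requirements below is checksumming the archives, the copy can be verified end-to-end on the archive server with `sha256sum -c` once a `.sha256` file travels alongside each archive. A minimal self-contained sketch; the file name here is a stand-in, not a real timestamped archive:

```shell
# Demo of verifying a copied archive against its .sha256 file,
# as the archive server would after an scp. Demo names only.
workdir="$(mktemp -d)"
cd "${workdir}"
printf 'archive contents' > backup_demo.tar.gz
sha256sum backup_demo.tar.gz > backup_demo.tar.gz.sha256
sha256sum -c backup_demo.tar.gz.sha256   # prints "backup_demo.tar.gz: OK"
```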

Automate Backup

This wouldn’t be a code challenge if we didn’t try to automate this process with some scripts.

Requirements:

  • Daily backup
  • Timestamped file names
  • One backup runs at a time
  • Copy to remote archive
  • Checksum the files
  • Delete old backups
  • Log the service with warnings and errors

The following script creates the archive, distinguishes tar's known non-fatal warnings from genuine errors, validates and checksums the result, prunes old backups, and locks down file permissions.

```bash
#!/bin/bash
# Script: run_backup.sh
set -Eeuo pipefail

umask 077

BACKUP_DIR="/root/backups"
TIMESTAMP="$(date +%Y%m%d_%H%M%S)"
ARCHIVE_NAME="backup_${TIMESTAMP}.tar.gz"
ARCHIVE_PATH="${BACKUP_DIR}/${ARCHIVE_NAME}"
CHECKSUM_PATH="${BACKUP_DIR}/${ARCHIVE_NAME}.sha256"
RETENTION_DAYS=14
LOCK_FILE="/var/lock/run_backup.lock"
DISK_SPACE_MIN_KB=524288 # 512MB minimum free space threshold

BACKUP_PATHS=(
  "opt/docker"
  "opt/scripts"
)

# Strings to match against tar's standard error output.
# If a warning contains any of these, it will not trigger a script failure.
ALLOWED_TAR_WARNINGS=(
  "file changed as we read it"
  "socket ignored"
  "file removed before we read it"
)

# Track whether the full backup process completed successfully.
BACKUP_COMPLETE=0

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] [INFO] $*"
}

warn() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] [WARN] $*" >&2
}

error() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] [ERROR] $*" >&2
}

on_error() {
    local exit_code=$?
    local line_no=$1
    local cmd=$2
    error "Backup failed at line ${line_no}: ${cmd} (exit code: ${exit_code})"
    exit "${exit_code}"
}

# On any non-zero exit before successful completion, remove incomplete backup artifacts
# so they are not mistaken for valid backups by future retention runs.
cleanup_on_exit() {
    local exit_code=$?
    if [[ "${exit_code}" -ne 0 && "${BACKUP_COMPLETE}" -eq 0 ]]; then
        warn "Cleaning up incomplete backup artifacts due to failure"
        rm -f "${ARCHIVE_PATH}" "${CHECKSUM_PATH}"
    fi
}

trap 'on_error ${LINENO} "$BASH_COMMAND"' ERR
trap cleanup_on_exit EXIT

require_command() {
    local cmd="$1"
    command -v "${cmd}" >/dev/null 2>&1 || {
        error "Required command not found: ${cmd}"
        exit 1
    }
}

check_source_paths() {
    local path
    for path in "${BACKUP_PATHS[@]}"; do
        if [[ ! -d "/${path}" ]]; then
            error "Configured backup path does not exist or is not a directory: /${path}"
            exit 1
        fi
    done
}

# Verify sufficient free disk space before attempting to create the archive.
check_disk_space() {
    local available_kb
    available_kb=$(df -Pk "${BACKUP_DIR}" | awk 'NR==2 {print $4}')
    if [[ "${available_kb}" -lt "${DISK_SPACE_MIN_KB}" ]]; then
        error "Insufficient disk space in ${BACKUP_DIR}: ${available_kb}KB available (minimum: ${DISK_SPACE_MIN_KB}KB)"
        exit 1
    fi
    local available_mb=$((available_kb / 1024))
    log "Disk space check passed: ${available_mb}MB available"
}

create_lock() {
    mkdir -p "$(dirname "${LOCK_FILE}")"
    exec 9>"${LOCK_FILE}"
    if ! flock -n 9; then
        error "Another backup job is already running"
        exit 1
    fi
}

run_tar_backup() {
    local tar_output
    local tar_rc=0
    local unexpected_warning=0
    local allowed_warning_seen=0
    local is_allowed=0
    local line
    local allowed_str

    log "Creating archive: ${ARCHIVE_PATH}"
    # Prefix each relative source path with "/" for display.
    log "Source paths: ${BACKUP_PATHS[*]/#//}"

    set +e
    tar_output="$(tar -czf "${ARCHIVE_PATH}" -C / "${BACKUP_PATHS[@]}" 2>&1)"
    tar_rc=$?
    set -e

    if [[ -n "${tar_output}" ]]; then
        while IFS= read -r line; do
            [[ -z "${line}" ]] && continue
            warn "tar output: ${line}"

            is_allowed=0
            for allowed_str in "${ALLOWED_TAR_WARNINGS[@]}"; do
                if [[ "${line}" == *"${allowed_str}"* ]]; then
                    is_allowed=1
                    allowed_warning_seen=1
                    break
                fi
            done

            if [[ "${is_allowed}" -eq 0 ]]; then
                unexpected_warning=1
            fi
        done <<< "${tar_output}"
    fi

    case "${tar_rc}" in
        0)
            if [[ "${unexpected_warning}" -eq 1 ]]; then
                error "tar reported unexpected warning output despite exit code 0"
                exit 1
            fi
            log "tar completed successfully"
            ;;
        1)
            if [[ "${unexpected_warning}" -eq 1 ]]; then
                error "tar returned exit code 1 with unexpected warnings"
                exit 1
            fi

            if [[ "${allowed_warning_seen}" -eq 1 ]]; then
                warn "tar reported recognized non-fatal warnings; archive creation continued"
            else
                error "tar returned exit code 1 but no recognized non-fatal warning was found"
                exit 1
            fi
            ;;
        *)
            error "tar failed with exit code ${tar_rc}"
            exit "${tar_rc}"
            ;;
    esac
}

validate_archive() {
    if [[ ! -f "${ARCHIVE_PATH}" ]]; then
        error "Archive was not created: ${ARCHIVE_PATH}"
        exit 1
    fi

    if [[ ! -s "${ARCHIVE_PATH}" ]]; then
        error "Archive was created but is empty: ${ARCHIVE_PATH}"
        exit 1
    fi

    gzip -t "${ARCHIVE_PATH}"
    tar -tzf "${ARCHIVE_PATH}" >/dev/null

    log "Archive validation passed"
}

validate_checksum() {
    if [[ ! -f "${CHECKSUM_PATH}" ]]; then
        error "Checksum file was not created: ${CHECKSUM_PATH}"
        exit 1
    fi

    if [[ ! -s "${CHECKSUM_PATH}" ]]; then
        error "Checksum file is empty: ${CHECKSUM_PATH}"
        exit 1
    fi

    (cd "${BACKUP_DIR}" && sha256sum -c "${ARCHIVE_NAME}.sha256" >/dev/null)
    log "Checksum validation passed"
}

cleanup_old_backups() {
    log "Removing backups older than ${RETENTION_DAYS} days from ${BACKUP_DIR}"

    find "${BACKUP_DIR}" -type f \
        \( -name 'backup_*.tar.gz' -o -name 'backup_*.tar.gz.sha256' \) \
        -mtime +"${RETENTION_DAYS}" -print -delete || {
            warn "Retention cleanup encountered an issue"
            return 1
        }
}

main() {
    require_command tar
    require_command gzip
    require_command sha256sum
    require_command flock
    require_command find
    require_command df
    require_command awk

    mkdir -p "${BACKUP_DIR}" || {
        error "Backup directory does not exist and could not be created: ${BACKUP_DIR}"
        exit 1
    }

    if [[ ! -w "${BACKUP_DIR}" ]]; then
        error "Backup directory is not writable: ${BACKUP_DIR}"
        exit 1
    fi

    create_lock
    check_source_paths
    check_disk_space

    log "Backup job started"

    run_tar_backup
    validate_archive

    log "Creating checksum: ${CHECKSUM_PATH}"
    (cd "${BACKUP_DIR}" && sha256sum "${ARCHIVE_NAME}" > "${ARCHIVE_NAME}.sha256")
    validate_checksum

    cleanup_old_backups || true

    log "Securing backup files as read-only"
    chmod 400 "${ARCHIVE_PATH}" "${CHECKSUM_PATH}"

    BACKUP_COMPLETE=1
    log "Backup job completed successfully"
}

main "$@"
```
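The retention rule deserves a sanity check: `find -mtime +14` matches files last modified more than 14 full 24-hour periods ago, which is easy to get off by one. A quick way to test it against fake old files (temporary demo paths, assuming GNU `touch -d`):

```shell
# Create one "old" and one "new" dummy backup, then apply the same
# find expression the script uses for retention.
dir="$(mktemp -d)"
touch -d '20 days ago' "${dir}/backup_old.tar.gz"
touch "${dir}/backup_new.tar.gz"
find "${dir}" -type f -name 'backup_*.tar.gz' -mtime +14 -print
# only backup_old.tar.gz should be listed
```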

Automation

cron is the usual suspect for these types of jobs, but we are going to use systemd timers.
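For reference, the cron equivalent would be a single root crontab entry (mirroring the 04:47 schedule used below):

```cron
# m  h  dom mon dow  command
47   4  *   *   *    /root/backups/run_backup.sh
```

systemd buys us journald logging, `Persistent=` catch-up runs after downtime, and `RandomizedDelaySec=` jitter for free.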

  • Service File (/etc/systemd/system/backup-local.service)
```ini
[Unit]
Description=Create local backup archive
Documentation=man:tar(1)
After=local-fs.target
ConditionPathIsDirectory=/root/backups

[Service]
Type=oneshot
ExecStart=/root/backups/run_backup.sh
User=root
Group=root
SyslogIdentifier=backup-local
Nice=10
IOSchedulingClass=best-effort
IOSchedulingPriority=7
```
  • Timer (/etc/systemd/system/backup-local.timer)
```ini
[Unit]
Description=Run local backup daily

[Timer]
OnCalendar=*-*-* 04:47:00
Persistent=true
RandomizedDelaySec=5m

[Install]
WantedBy=timers.target
```

Then enable, run, and verify the new timer and service:

```bash
systemctl daemon-reload
systemctl enable --now backup-local.timer
systemctl list-timers --all
systemctl status backup-local.service
systemctl start backup-local.service
journalctl -u backup-local.service -n 100 --no-pager
ls -lh /root/backups
```
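One last check that the script does not cover: confirming an archive actually restores. A minimal round-trip sketch using scratch directories (demo paths, never the live /opt trees):

```shell
# Archive a scratch directory, restore it elsewhere, and compare.
src="$(mktemp -d)"
restore="$(mktemp -d)"
echo 'hello' > "${src}/file.txt"
tar -czf "${restore}/demo.tar.gz" -C "${src}" .
mkdir "${restore}/out"
tar -xzf "${restore}/demo.tar.gz" -C "${restore}/out"
diff -r "${src}" "${restore}/out" && echo "restore OK"
```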
