====== Backup Script ======

==== Goals ====
\\
  - You want to schedule a backup to be done via a cron job (see the crontab sketch after this list).
  - You want the script initiated from, and stored in, a safe location on any server in the network, not just on the router.
  - You do not wish to install SFTP just for these backups.
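\\
For the scheduling goal, a single crontab entry on the backup server is enough. A minimal sketch, assuming the script is saved as ''/usr/local/bin/backup_script.sh''; the path, schedule and arguments are examples, not fixed by the script:

<code bash>
# crontab -e on the backup server: run every night at 03:10
# (m h dom mon dow command)
10 3 * * * /usr/local/bin/backup_script.sh -d /srv/router_backups -i /home/backup/.ssh/router_key -r 192.168.1.1
</code>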
\\
The script uses a here document to execute commands on the router. You could, of course, create the backup on the router itself and collect it afterwards. This way, however, just one run of the script on the backup server creates, transfers and stores the backup.
\\ \\ The backup is created via ''nvram save''. This is the same way backups are created in the web interface.
You may cross-check that the backups are identical as follows (a sketch of these steps follows the list):
\\
  - Download a backup via the web interface.
  - Create a backup via the script.
  - Copy both files to the router.
  - Convert both files there via the ''nvram convert'' command.
  - Perform a diff between the two resulting text files.
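\\
A minimal sketch of this cross-check, run on the router. The filenames are examples, and the exact ''nvram convert'' invocation may differ between firmware builds, so check its usage output first:

<code bash>
# gui.cfg: backup downloaded from the web interface
# script.cfg: backup created by the script
nvram convert gui.cfg    > gui.txt
nvram convert script.cfg > script.txt
diff gui.txt script.txt && echo "backups are identical"
</code>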
\\

The script then archives the resulting data in a tar file and sends it through the netcat command, transferring it over the network. Restoring a backup file uses the same procedure as restoring an archive created in the web interface. This was last tested on 2025-04-29.
| + | |||
| + | \\ | ||
| + | |||
| + | |||
| + | ==== Prerequisites | ||
| + | |||
| + | \\ | ||
| + | |||
  - The netcat command must be installed on the backup server (the router side uses its own ''nc'').
  - The user executing the script on the backup server must have access to the router's root account via a key-based SSH login (see the key setup sketch after this list).
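\\
The key-based access can be prepared once by hand. A sketch, assuming a dedicated key file ''~/.ssh/router_key'' and the router at ''192.168.1.1'' (both are examples); the public key is then added to the router's SSH authorized keys in the web interface:

<code bash>
# on the backup server: create a dedicated, passphrase-less key pair
ssh-keygen -t rsa -f ~/.ssh/router_key -N ""
# add the content of ~/.ssh/router_key.pub to the router's SSH
# authorized keys (web interface), then test the login:
ssh root@192.168.1.1 -i ~/.ssh/router_key "nvram get router_name"
</code>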
| + | |||
| + | \\ | ||
| + | |||
| + | ==== Remarks ==== | ||
| + | |||
| + | \\ | ||
| + | |||
  - Command line arguments: backup directory, id file and router (see ''--help'').
  - Sometimes tar may fail. You can solve this by just cleaning up the transfer file and any hanging netcat. The next \\ cron run does that anyway.
  - All earlier backups from the same day are deleted.
  - A total number of backups is kept. This number is configurable. \\ Older backups are deleted (see the retention sketch after this list).
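\\
The retention rule works purely on the sortable filenames: a reverse ''ls'' sort puts the newest backups first, and ''sed'' drops the first N lines, so whatever remains is exactly the set of files to delete. A sketch with an assumed prefix and limit (the script derives both from its variables):

<code bash>
# keep the 10 newest backups, delete the rest
NO_OF_DIFF_FILES_TO_BE_KEPT=10
OLDER_FILES=`ls tomato_*.cfg 2> /dev/null | sort -r | sed -e 1,${NO_OF_DIFF_FILES_TO_BE_KEPT}d`
[ "${OLDER_FILES}" != "" ] && rm -v ${OLDER_FILES}
</code>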
| + | |||
| + | \\ | ||
| - | Script starts here: | ||
| - | ============================================== | ||
| # | # | ||
| - | USER=root | + | DATE_REGEX=20[0-9][0-9][01][0-9][0123][0-9] |
| + | | ||
| + | | ||
| + | |||
| + | | ||
| | | ||
| + | | ||
| + | |||
| + | while [[ $# -gt 0 ]]; do | ||
| + | case $1 in | ||
| + | -d|--dir2backup) | ||
| + | DIR2BACKUP=$2 ;; | ||
| + | -i|--idfile) | ||
| + | LOCAL_ID_FILE=$2 ;; | ||
| + | -r|--router) | ||
| + | ROUTER=$2 ;; | ||
| + | -*|--*) | ||
| + | echo " | ||
| + | -h|--help) | ||
| + | echo "usage $0 <option argument> | ||
| + | echo " | ||
| + | echo " | ||
| + | echo " | ||
| + | echo " | ||
| + | exit ;; | ||
| + | esac | ||
| + | shift; shift | ||
| + | done | ||
| + | |||
| + | | ||
| + | echo | ||
| + | echo " | ||
| + | echo "id file: " | ||
| + | echo " | ||
| + | echo " | ||
| + | echo | ||
| + | | ||
| | | ||
| - | | ||
| | | ||
| | | ||
| - | EXT=.cfg | + | |
| | | ||
| - | ROUTER=`ip r | grep default | head -1 | cut -d " " -f 3` | + | # It may happen, that Tomato router has no - or other, wrong - time |
| - | pushd ${BACKUP_DIR} | + | # take date from localhost (i.e. backup server) into backup filename |
| + | | ||
| - | (netcat -l -p ${PORT} > ${TRANSFER_FILENAME}) & | + | pushd ${DIR2BACKUP} > /dev/null |
| + | |||
| + | rm -f ${TRANSFER_FILENAME} | ||
| # | # | ||
| - | # | + | # |
| - | # VAR=`nvram get os_version` | + | # VAR=`nvram get os_version` |
| - | # seem not to work in bash via here doc, so write results into script file and source it | + | # seem not to work in bash via here doc, so write results into script file and source it. |
| - | # Further the individual filename is general | + | # Further the individual filename is generally |
| # | # | ||
| - | ssh ${USER}@${ROUTER} -i ${LOCAL_ID_FILE}<< | + | # Kill netcat zombies |
| - | | + | kill -9 `ps -ef | grep -v grep | grep netcat | sed -e "s/ [ ]*/ /g" | cut -d " " -f 2` 2> /dev/null |
| - | | + | |
| - | | + | # Create the backup file on the router by executing the following commands (indented lines) there |
| - | | + | ssh ${USER}@${ROUTER} -i ${LOCAL_ID_FILE} << |
| - | | + | rm -f ${SCRIPT_FILE} ${TRANSFER_FILENAME} ${PREFIX}_*_${DATE_REGEX}_${TIME_REGEX}.${EXT} |
| - | | + | echo "nvram save ${PREFIX}" |
| - | date +%Y%m%d_%H%M | + | nvram get os_version | sed -e "s/ .*$//" >> ${SCRIPT_FILE} |
| - | | + | echo " |
| - | cat ${SCRIPT_FILE} | + | nvram get t_model_name | tr " " " |
| - | source ${SCRIPT_FILE} | + | nvram get router_name >> ${SCRIPT_FILE} |
| - | | + | echo ${DATE} |
| - | | + | sed -e " |
| - | | + | source ${SCRIPT_FILE} |
| + | tar -cvf ${TRANSFER_FILENAME} | ||
| + | cat ${TRANSFER_FILENAME} | nc ${BACKUPHOST} | ||
| + | sleep 5 # just wait a little while before deleting the files not needed here any more | ||
| + | | ||
| | | ||
| - | tar -xvf ${TRANSFER_FILENAME} | + | THIS_BACKUP_FILE=`tar -xvf ${TRANSFER_FILENAME} |
| - | rm ${TRANSFER_FILENAME} | + | if [ " |
| - | popd | + | echo "Saved on this computer in `pwd`:" |
| + | echo ${THIS_BACKUP_FILE} | ||
| + | echo | ||
| + | TODAYS_BACKUP_FILES_PREFIX=${THIS_BACKUP_FILE%_*} | ||
| + | ALL_BACKUP_FILES_PREFIX=${TODAYS_BACKUP_FILES_PREFIX%_*} | ||
| + | # Keep only one file (the latest) per day - delete earlier file of same day | ||
| + | LIST_OLD_BACKUPS_OF_TODAY=`ls ${TODAYS_BACKUP_FILES_PREFIX}_${TIME_REGEX}.${EXT} 2> /dev/null | grep -v ${THIS_BACKUP_FILE}` | ||
| + | if [ " | ||
| + | echo " | ||
| + | rm -fv ${LIST_OLD_BACKUPS_OF_TODAY} | ||
| + | echo | ||
| + | fi | ||
| + | # In total, keep only ${NO_OF_DIFF_FILES_TO_BE_KEPT} files - delete older files (of any day) | ||
| + | OLDER_FILES=`ls ${ALL_BACKUP_FILES_PREFIX}_${DATE_REGEX}_${TIME_REGEX}.${EXT} | sort -r | sed -e 1, | ||
| + | if [ " | ||
| + | echo "keep only ${NO_OF_DIFF_FILES_TO_BE_KEPT} in total, delete:" | ||
| + | rm -fv ${OLDER_FILES} | ||
| + | echo | ||
| + | fi | ||
| + | # delete transfer file only when tar was successful, i.e. only here | ||
| + | | ||
| + | | ||
| + | echo " | ||
| + | fi | ||
| + | | ||
| + | \\ | ||
| + | \\ | ||