The script at the bottom of this page will create and download the backup without needing to have SFTP enabled on the router.
You could, of course, create the backup as a cron job on the router itself and then use the mechanism from the script below to download the backup file. However, let's assume you want everything done in just one run of the script. To achieve this, the script both creates the backup in an individual, timestamped file and downloads it.
In this way, a single run of the script on the backup server creates the backup and transfers it to a safe location.
The backup is created using the “nvram save” command. This is how backups are done “under the hood” in the web interface.
You may cross-check that the backups are identical to the ones created via the GUI (last tested on 2025-04-22): run
“nvram convert <filename1>” and “nvram convert <filename2>”, redirect each output into its own text file (e.g. result_file1.txt and result_file2.txt), and compare the two files with “diff”.
The script then archives the resulting data in a tar file and sends it over the network with netcat. Restoring a backup file uses the same procedure as restoring an archive created in the web interface. This was last tested on 2025-04-29.
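The receive-then-send pattern the script uses (a background netcat listener on the backup host, the router piping the tar archive into nc) can be sketched locally. In this demo a FIFO stands in for the TCP connection, and all filenames are made up:

```shell
#!/bin/sh
# Demo of the transfer pattern: a background "receiver" writes the incoming
# stream to config.tar while the "sender" streams a tar archive into it.
# A FIFO stands in for the netcat TCP connection; no router is involved.
set -e
tmp=$(mktemp -d)
cd "$tmp"
echo "dummy nvram data" > FreshTomato_demo.cfg    # stand-in for the real backup
mkfifo pipe
cat pipe > config.tar &                # receiver (script: netcat -l -p PORT > config.tar)
tar -cf - FreshTomato_demo.cfg > pipe  # sender   (script: cat config.tar | nc HOST PORT)
wait
tar -tf config.tar                     # the archive arrived intact
```

Note that the receiver has to be started first, exactly as in the script below, where the netcat listener is launched in the background before the ssh session makes the router send.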
#!/bin/bash
# Regexes matching the date (YYYYMMDD) and time (HHMM) parts of the backup filenames
DATE_REGEX=20[0-9][0-9][01][0-9][0123][0-9]
TIME_REGEX=[012][0-9][0-5][0-9]
NO_OF_DIFF_FILES_TO_BE_KEPT=10
DIR2BACKUP=/home/${LOGNAME}/Router_Backups
LOCAL_ID_FILE=/home/${LOGNAME}/.ssh/id_tomato_ecdsa
# Default router: the default gateway
ROUTER=`ip r | grep default | head -1 | cut -d " " -f 3`
while [[ $# -gt 0 ]]; do
    case $1 in
        -d|--dir2backup)
            DIR2BACKUP=$2 ;;
        -i|--idfile)
            LOCAL_ID_FILE=$2 ;;
        -r|--router)
            ROUTER=$2 ;;
        -h|--help)
            echo "usage: $0 <option argument>"
            echo "options are:"
            echo " -d | --dir2backup  directory to save the backup, default: ${DIR2BACKUP}"
            echo " -r | --router      router hostname or IP to back up, default: ${ROUTER}"
            echo " -i | --idfile      ssh identity file to use, default: ${LOCAL_ID_FILE}"
            exit ;;
        -*)
            echo "Unknown option $1"
            exit 1 ;;
    esac
    shift; shift
done
# Determine this host's address in the router's /24 (assumes the router's address ends in .1)
BACKUPHOST=`ip r | grep "${ROUTER%1}0/24" | head -1 | cut -d " " -f 9`
echo
echo "Backup dir: "${DIR2BACKUP}
echo "id file: "${LOCAL_ID_FILE}
echo "Backing up: "${ROUTER}
echo "Backup host: "${BACKUPHOST}
echo
USER=root
PORT=5555
SCRIPT_FILE=nvram_save_cfg.sh
PREFIX=FreshTomato
EXT=cfg
TRANSFER_FILENAME=config.tar
# The Tomato router may have no time set, or a wrong one, so
# take the date from localhost (i.e. the backup server) for the backup filename
DATE=`date +%Y%m%d_%H%M`
pushd ${DIR2BACKUP} > /dev/null
rm -f ${TRANSFER_FILENAME}
#
# Things like
# VAR=`nvram get os_version`
# do not seem to work in bash via a here doc, so write the results into a script file and source it.
# Furthermore, the individual backup filename is not known in advance, so tar it into a file with a fixed name.
#
# Kill netcat zombies
kill -9 `ps -ef | grep -v grep | grep netcat | sed -e "s/ [ ]*/ /g" | cut -d " " -f 2` 2> /dev/null
(netcat -l -p ${PORT} > ${TRANSFER_FILENAME}) &
ssh ${USER}@${ROUTER} -i ${LOCAL_ID_FILE} <<ENDSSH
rm -f ${SCRIPT_FILE} ${TRANSFER_FILENAME} ${PREFIX}_*_${DATE_REGEX}_${TIME_REGEX}.${EXT}
echo "nvram save ${PREFIX}" >> ${SCRIPT_FILE}
nvram get os_version | sed -e "s/ .*$//" >> ${SCRIPT_FILE}
echo "on" >> ${SCRIPT_FILE}
nvram get t_model_name | tr " " "_" >> ${SCRIPT_FILE}
nvram get router_name >> ${SCRIPT_FILE}
echo ${DATE} >> ${SCRIPT_FILE}
sed -e "N;N;N;N;N;s/\n/_/g;s/$/.${EXT}/" -i ${SCRIPT_FILE}
source ${SCRIPT_FILE}
tar -cvf ${TRANSFER_FILENAME} ${PREFIX}_*_${DATE_REGEX}_${TIME_REGEX}.${EXT} > /dev/null
cat ${TRANSFER_FILENAME} | nc ${BACKUPHOST} ${PORT}
sleep 5 # just wait a little bit before deleting the files not needed here any more
rm -f ${SCRIPT_FILE} ${TRANSFER_FILENAME} ${PREFIX}_*_${DATE_REGEX}_${TIME_REGEX}.${EXT}
ENDSSH
THIS_BACKUP_FILE=`tar -xvf ${TRANSFER_FILENAME} | sed -e "s/^.*${PREFIX}/${PREFIX}/"`
if [ "${THIS_BACKUP_FILE}" ]; then
    echo "Saved on this computer in `pwd`:"
    echo ${THIS_BACKUP_FILE}
    echo
    TODAYS_BACKUP_FILES_PREFIX=${THIS_BACKUP_FILE%_*}
    ALL_BACKUP_FILES_PREFIX=${TODAYS_BACKUP_FILES_PREFIX%_*}
    # Keep only one file (the latest) per day - delete earlier files of the same day
    LIST_OLD_BACKUPS_OF_TODAY=`ls ${TODAYS_BACKUP_FILES_PREFIX}_${TIME_REGEX}.${EXT} 2> /dev/null | grep -v ${THIS_BACKUP_FILE}`
    if [ "${LIST_OLD_BACKUPS_OF_TODAY}" ]; then
        echo "deleting earlier backups from today (to keep just one per day - the most recent):"
        rm -fv ${LIST_OLD_BACKUPS_OF_TODAY}
        echo
    fi
    # In total, keep only ${NO_OF_DIFF_FILES_TO_BE_KEPT} files - delete older files (of any day)
    OLDER_FILES=`ls -t ${ALL_BACKUP_FILES_PREFIX}_${DATE_REGEX}_${TIME_REGEX}.${EXT} | sed -e 1,${NO_OF_DIFF_FILES_TO_BE_KEPT}d`
    if [ "${OLDER_FILES}" ]; then
        echo "keeping only ${NO_OF_DIFF_FILES_TO_BE_KEPT} backups in total, deleting:"
        rm -fv ${OLDER_FILES}
        echo
    fi
    # Delete the transfer file only when tar was successful, i.e. only here
    rm ${TRANSFER_FILENAME}
else
    echo "unknown error while untarring"
fi
popd > /dev/null
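The least obvious step in the script is the sed one-liner “N;N;N;N;N;s/\n/_/g” that turns the six lines collected on the router into a single “nvram save <individual filename>” command. A quick local demonstration, with made-up version and model values:

```shell
#!/bin/sh
# The six lines the script collects on the router, in this order:
# "nvram save <prefix>", os_version, "on", model name, router name, timestamp
tmp=$(mktemp -d)
printf '%s\n' "nvram save FreshTomato" "2023.1" "on" "RT-AC68U" "myrouter" "20250422_1200" > "$tmp/demo.sh"
# N appends the next input line to the pattern space (five times), then every
# embedded newline is replaced by "_" and ".cfg" is appended at the end
sed -e "N;N;N;N;N;s/\n/_/g;s/$/.cfg/" "$tmp/demo.sh"
# -> nvram save FreshTomato_2023.1_on_RT-AC68U_myrouter_20250422_1200.cfg
```

Sourcing the resulting one-line file on the router then executes exactly that nvram save command, which is why the backup filename always follows the pattern matched by DATE_REGEX and TIME_REGEX.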