Introduction / Why This Is Needed
Manual backup creation is a tedious and unreliable task. Automation via cron allows you to "forget" about backups, with the confidence that important data (configurations, databases, web projects) is saved regularly. This guide will show you how to set up a simple yet effective backup mechanism on any Linux system.
Requirements / Preparation
Before starting, ensure that:
- You have access to a Linux terminal (via SSH or locally).
- The `cron` scheduler is installed (usually present by default).
- You have defined:
  - What data to back up (e.g., `/etc/`, `/var/www/`, home directories).
  - Where to store it (local folder, separate disk, network resource).
  - How often (daily, weekly).
- There is sufficient free space on the target disk.
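The checklist above can be verified with a couple of commands. This is a minimal sketch; the target path `/var/backups` is an assumption — substitute your own destination:

```bash
# Quick pre-flight check; /var/backups is an assumed destination
TARGET="/var/backups"
command -v crontab >/dev/null && echo "crontab tool found" || echo "install cron first"
command -v tar >/dev/null && echo "tar found"
df -h "$TARGET" 2>/dev/null || df -h /    # free space on (or near) the target
```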
Step 1: Create a bash script for the backup
Create a script file, for example, `/usr/local/bin/backup.sh` (requires sudo), or in your home directory: `~/scripts/backup.sh`.
```bash
#!/bin/bash

# Variables
BACKUP_DIR="/var/backups/myapp"
SOURCE_DIRS="/etc /var/www /home"
LOG_FILE="/var/log/backup.log"
DATE=$(date +"%Y-%m-%d_%H-%M-%S")
ARCHIVE="backup_$DATE.tar.gz"

# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Create the archive with logging
# (SOURCE_DIRS is deliberately unquoted so it splits into separate paths)
tar -czf "$BACKUP_DIR/$ARCHIVE" $SOURCE_DIRS 2>> "$LOG_FILE"

# Check if the archive was created successfully
if [ $? -eq 0 ]; then
    echo "[$DATE] SUCCESS: archive $ARCHIVE created." >> "$LOG_FILE"
else
    echo "[$DATE] ERROR: failed to create archive $ARCHIVE." >> "$LOG_FILE"
    exit 1
fi

# (Optional) Delete old backups older than 30 days
find "$BACKUP_DIR" -type f -name "*.tar.gz" -mtime +30 -delete
```
Explanation:
- `tar -czf` — creates a compressed tar archive.
- `2>> $LOG_FILE` — redirects error messages to the log.
- `find ... -mtime +30 -delete` — deletes files older than 30 days.
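If the `-mtime +30` semantics are unclear, they are easy to try out safely in a temporary directory before trusting them with real backups (all file names below are made up for the demo):

```bash
# Demo: find -mtime +30 matches files modified more than 30 days ago
DEMO=$(mktemp -d)
touch "$DEMO/backup_new.tar.gz"
touch -d "40 days ago" "$DEMO/backup_old.tar.gz"   # backdate one file (GNU touch)
find "$DEMO" -type f -name "*.tar.gz" -mtime +30 -delete
ls "$DEMO"          # only backup_new.tar.gz remains
rm -rf "$DEMO"
```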
Give the script execute permissions:
```bash
chmod +x ~/scripts/backup.sh
```
Step 2: Test the script manually
Run the script from your user account (or with sudo if the script is in a system folder):
```bash
~/scripts/backup.sh
```
- Ensure that a file `backup_YYYY-MM-DD_HH-MM-SS.tar.gz` appeared in `$BACKUP_DIR`.
- Check the log for errors: `tail -f /var/log/backup.log`.
If the script doesn't work — fix paths, permissions, or commands before configuring cron.
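Beyond checking that the archive file exists, it helps to list its contents with `tar -tzf` to confirm the expected paths were actually captured. A self-contained sketch using temporary directories (file names are illustrative):

```bash
# Create a tiny archive from a temp dir, then list its contents to verify it
SRC=$(mktemp -d); OUT=$(mktemp -d)
echo "example" > "$SRC/app.conf"
tar -czf "$OUT/backup_test.tar.gz" -C "$SRC" .
tar -tzf "$OUT/backup_test.tar.gz"    # listing includes ./app.conf
rm -rf "$SRC" "$OUT"
```

For your real backups, the same `tar -tzf "$BACKUP_DIR/backup_....tar.gz"` check works.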
Step 3: Add the job to crontab
Open the crontab editor for your user:
```bash
crontab -e
```
Add a line (in this example — daily at 2:00 AM):

```
0 2 * * * /home/your_user/scripts/backup.sh
```

Crontab format: `minute hour day_of_month month day_of_week command`
- `0 2 * * *` — 2:00 AM every day.
- Use the full path to the script.
To run as root (if you need to back up system files), run `sudo crontab -e` and add the same line there.
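Before saving, you can sanity-check that the line splits into the five schedule fields plus a command. This sketch reuses the example path from above; note that globbing must be disabled so the `*` fields are not expanded into file names:

```bash
# Sanity-check a crontab line: 5 schedule fields + a command = at least 6 fields
LINE='0 2 * * * /home/your_user/scripts/backup.sh'
set -f              # disable globbing so * stays literal
set -- $LINE
set +f
if [ $# -ge 6 ]; then
    echo "looks valid: minute=$1 hour=$2"
else
    echo "malformed crontab line"
fi
```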
Step 4: Configure rotation of old backups
We already added the `find ... -delete` command to the script itself. You can adjust the retention period by changing `+30` to a different number of days (e.g., `+7` for one week).
Alternatively, you can configure rotation via `logrotate` for more flexible management, but for simple backups, `find` is sufficient.
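Another option is count-based rotation: keep a fixed number of archives regardless of age. A sketch, assuming GNU userland (`ls -1t`, `xargs -r`) and the directory from the script above; the number 7 is arbitrary:

```bash
# Keep only the 7 newest archives; delete the rest
# (safe here because the generated archive names contain no spaces)
BACKUP_DIR="/var/backups/myapp"
ls -1t "$BACKUP_DIR"/*.tar.gz 2>/dev/null | tail -n +8 | xargs -r rm --
```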
Step 5: Verify the automatic backup works
- Temporarily change the time in crontab to the next few minutes (e.g., `*/2 * * * *` — every 2 minutes) for testing.
- Wait for it to run.
- Check:
  - Did a new archive appear in `$BACKUP_DIR`?
  - Was an entry added to `/var/log/backup.log`?
- Revert to the original schedule.
Result Verification
- Are archives being created? — `ls -lh /var/backups/myapp/`
- Are logs being written? — `tail -f /var/log/backup.log`
- Is cron launching the job? — `grep CRON /var/log/syslog` (or `journalctl -u cron` on systemd)
- Is the disk running out of space? — `df -h`
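The checks above can be wrapped into one small status summary. A sketch using the paths from the example script (adjust to yours):

```bash
# One-shot status summary for the backup setup
BACKUP_DIR="/var/backups/myapp"
LOG_FILE="/var/log/backup.log"
echo "Latest archive: $(ls -1t "$BACKUP_DIR"/*.tar.gz 2>/dev/null | head -n 1)"
echo "Last log line:  $(tail -n 1 "$LOG_FILE" 2>/dev/null)"
df -h "$BACKUP_DIR" 2>/dev/null | tail -n 1   # free space on the target filesystem
```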
If everything works, your system now automatically protects your data.
Potential Issues
"Permission denied" error when creating an archive or writing to the log
- Cause: The user under which cron runs lacks permissions to read the source directories or write to `$BACKUP_DIR`/`$LOG_FILE`.
- Solution:
  - Run the script as a user with the necessary permissions (e.g., root via `sudo crontab -e`).
  - Or adjust permissions on the files/directories with `chmod` and `chown`.
  - Ensure the target directory exists (`mkdir -p`).
Archive is created empty or incomplete
- Cause: Incorrect paths specified in `tar` or insufficient disk space.
- Solution:
  - Check the `SOURCE_DIRS` paths in the script (the absence of a trailing slash on a directory can affect the archive structure).
  - Ensure there is enough space on the target disk (`df -h`).
Cron doesn't run the script, but it works manually
- Cause: Cron uses a minimal environment. The script may use relative paths or variables unavailable in cron.
- Solution:
  - In the script, always use full absolute paths (as in the example above).
  - If needed, set environment variables within the script itself or in crontab (e.g., `PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin`).
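You can reproduce this failure mode before scheduling by running a command under an almost-empty environment with `env -i`; if the script works there, it will likely work under cron too. The `PATH` value below approximates cron's sparse default:

```bash
# Simulate a cron-like minimal environment.
# Replace the echo/command with your actual script path to test it.
env -i HOME="$HOME" PATH=/usr/bin:/bin /bin/sh -c 'echo "PATH=$PATH"; command -v tar'
```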
Too many old backups, disk fills up
- Cause: The deletion command isn't working or the retention period is too long.
- Solution:
  - Check the `find` syntax and the `$BACKUP_DIR` path.
  - Reduce the retention period (the `-mtime +N` parameter).
  - Consider compressing archives (already done by `gzip` via `tar -czf`) or moving backups to an external disk/cloud.