I thought I would share a script I wrote to back up my web server every night.

This simple Python script:

  • Creates a temporary directory to copy everything to
  • Copies everything in my home directory
  • Copies everything from my root web directory
  • Copies my php.ini file (I hate having to set up my php.ini file again)
  • Dumps all of the databases that I may have in MariaDB into a nice SQL dump file
  • Tars everything up into a single file
  • Then uses one of my favorite Linux CLI tools, rclone, to copy the tar file to my Google Drive
  • Cleans up the temp directory that it created

import os
import time

# Note: the staging path and Drive folder names were not defined in the
# original listing, so these are stand-ins -- adjust them for your setup.
# The epoch timestamp gives every backup a unique name.
f = str(int(time.time()))      # seconds since the epoch -> unique backup name
c = "MyServer"                 # server name used in the Drive path (placeholder)
p = "/var/backup-" + f + "/"   # temp staging directory (trailing slash matters)

# os.system (unlike os.popen) waits for each command to finish before the
# next one starts, so tar never runs against a half-copied directory.
os.system("mkdir " + p)
os.system("cp -r /root " + p)
os.system("cp -r /var/www/ " + p)
os.system("cp /etc/php.ini " + p)
os.system("mysqldump -u root -pPassword --all-databases > " + p + "sql.sql")
os.system("tar -zcvf /var/backup.tgz " + p)
os.system("/root/rclone/rclone copy /var/backup.tgz g:/Backups/WebServers/" + c + "/" + f + "/")
os.system("rm -rf " + p)
os.system("rm -f /var/backup.tgz")
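If you would rather not shell out with string concatenation (paths with spaces or odd characters can break the command line), the same stage-tar-clean-up cycle can be done with subprocess. This is just a sketch with placeholder paths and a stand-in file, not part of the script above:

```python
import os
import subprocess
import time

f = str(int(time.time()))      # epoch timestamp, same naming idea as above
p = "/tmp/backup-" + f         # hypothetical staging directory

# Each command is an argument list, so nothing needs shell quoting, and
# check=True raises an exception if any step fails instead of carrying on.
subprocess.run(["mkdir", "-p", p], check=True)

# Stand-in for the real copies (home dir, web root, php.ini, SQL dump).
with open(p + "/demo.txt", "w") as fh:
    fh.write("stand-in content")

subprocess.run(["tar", "-zcvf", p + ".tgz", p], check=True)
subprocess.run(["rm", "-rf", p], check=True)   # clean up the staging dir
```

The nice part of check=True is that a failed copy stops the run before a bad tarball ever gets uploaded.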

I then run the script every night by scheduling it as a cron job.
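For reference, the crontab entry looks something like this (assuming the script lives at /root/backup.py; adjust the path and time to taste):

```
# Run the backup at 2:00 AM every night (add via crontab -e)
0 2 * * * /usr/bin/python /root/backup.py
```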

Also, just in case you're wondering: I use the current epoch time as the file name so that each backup gets a unique name. I keep all of my backups forever, so the last thing I want is overlapping file names accidentally overwriting old backups.
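The naming scheme is easy to see from a Python prompt; the epoch counter only ever moves forward, so two backups can't collide as long as they start at least a second apart:

```python
import time

f = str(int(time.time()))        # seconds since 1970-01-01 UTC, e.g. "1731283200"
name = "backup-" + f + ".tgz"
print(name)
```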