Script to back up a cPanel account to S3 before termination
Sometimes a cPanel account gets deleted accidentally. Even if scheduled backups are enabled, you may still want one final, up-to-the-minute backup taken immediately before termination. The script below does exactly that.
1. Go to the following path:
   /usr/local/cpanel/scripts
2. Create a file named "prekillacct":
   vi prekillacct
3. Install the AWS CLI on the server by following the official AWS documentation.
4. Create an IAM user, give it the appropriate S3 permissions, and download its credentials.
5. Run the "aws configure" command and enter the access key ID and secret access key.
6. Copy and paste the below script into the "prekillacct" file.
```shell
#!/bin/bash
#==========================================================================
# This script packages an account and moves the pkgacct archive to
# /backup/suspended-acct-backups. Intended to be hooked as part of
# cPanel's account termination process.
#
# Prior to the backup, we also copy this stuff to the user's homedir:
# - zones (so we have the IP)
# - local vs. remotedomains status
#==========================================================================
# To do:
# - add helptext
# - add error checking/failing
# - add excludes, to reduce the size of the backup
#==========================================================================

#====================================
# Do we want to skip the backup? -- allows killacct to be run
# on the CLI without creating the backup.
#====================================
echo -n "Do you want to skip the backup creation? [y/N] (3 second timeout, defaults to no) "
read -t 3 answer
if [ "$answer" == "y" ]; then
    echo "backup aborted"
    exit 0
else
    echo "backup not aborted" >> /tmp/prekillacct.log
fi

#====================================
ME=${0##*/}
backupdir=/backup/suspended-acct-backups

if [ ! -f "/usr/local/cpanel/cpanel" ]; then
    echo "This script only works on cPanel servers"
    exit 1
fi

exec > /tmp/prekillacct.log 2>&1

if [[ ! -d "$backupdir" && ! -L "$backupdir" ]]; then
    echo 'Backup dir does not exist, creating backup directory'
    mkdir -pv "$backupdir"
fi

#====================================
# Parse argument pairs (cPanel passes e.g. "user <username>")
#====================================
while [ $# != 0 ]; do
    eval cp_$1="$2"
    shift 2
done

eval homedir='~'$cp_user

#====================================
# Add external files to the cpmove file:
# - Back up a copy of the zone files in ~/.named_<hostname>
# - Back up the local/remote domain configuration
#====================================
hostname=$(hostname)
namedir=$homedir/.named_$hostname
test ! -d "$namedir" && mkdir "$namedir"

grep ": $cp_user\$" /etc/userdomains | while IFS=":$IFS" read domain u1
do
    echo "backing up $domain zone ..."
    # keep them owned by root
    cp /var/named/${domain}.db "$namedir"
    # back up MX hosting for the domain as well
    grep "^$domain$" /etc/localdomains /etc/remotedomains >> "$namedir/.mxhost"
done

# Correct file permissions, else pkgacct won't be able to access these files.
chown $cp_user:$cp_user "$namedir"
chown $cp_user:$cp_user "$namedir"/*

#====================================
# Back up the account without overwriting the existing backup,
# unless the existing backup is more than 24 hours old.
#====================================
filename=$backupdir/cpmove-${cp_user}.tar.gz
if [ -f "$filename" ] && [ "$(find "$filename" -mtime 0 | wc -l)" -gt 0 ]
then
    echo
    echo "$ME: $cp_user: Exiting, as backup already exists in the destination directory and is less than 24 hours old"
    echo
else
    /scripts/pkgacct $cp_user $backupdir
    # Replace bucketname/foldername with your own bucket and prefix.
    aws s3 cp "$backupdir" s3://bucketname/foldername/ --recursive
    rm -f "$backupdir"/*
fi

exit 0
```
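cPanel invokes prekillacct with key/value argument pairs (for example `user <username>`), which the `while`/`eval` loop above turns into `cp_`-prefixed shell variables. Here is a minimal standalone sketch of that parsing; the argument values (`bob`, `1`) are made-up examples, not real cPanel data:

```shell
#!/bin/bash
# Demo of the argument-pair parsing used in prekillacct.
# Simulate cPanel calling the hook with key/value pairs:
set -- user bob killdns 1

# Consume arguments two at a time, creating cp_<key> variables.
while [ $# != 0 ]; do
    eval "cp_$1=\"$2\""
    shift 2
done

echo "cp_user=$cp_user cp_killdns=$cp_killdns"
# → cp_user=bob cp_killdns=1
```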
7. For testing purposes, create an account and terminate it.
8. Once it is terminated, go to S3 and check for the backup; you will see it there.
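One detail worth understanding when testing: the script refuses to overwrite a local archive that is less than 24 hours old, using `find -mtime 0` (which matches files modified within the last day). This standalone sketch reproduces that check with a temporary file standing in for the real cpmove archive:

```shell
#!/bin/bash
# Demo of the 24-hour freshness check used in the script:
# "find FILE -mtime 0" prints FILE only if it was modified
# within the last 24 hours.
filename=$(mktemp)   # stands in for cpmove-<user>.tar.gz, just created

if [ -f "$filename" ] && [ "$(find "$filename" -mtime 0 | wc -l)" -gt 0 ]; then
    result="skip: backup is less than 24 hours old"
else
    result="create: no recent backup"
fi
echo "$result"
# → skip: backup is less than 24 hours old
rm -f "$filename"
```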