Conveying Files for Backup: CloudBerry Backup with Pre/Post Scripts
Working with customers is always a great experience and is obviously one of the best ways to learn new things. One of the cases I've worked on (as a CloudBerry Lab solutions architect) had interesting requirements. The customer wanted to offload all their video data (surveillance content exported from their cameras) to Amazon S3 with the Infrequent Access storage class. That part is simple, but the interesting bit is that the amount of data depended on the time of day, and all these files were exported by a third-party tool into a certain directory on the server running the cloud backup tool. You never knew their total size; moreover, you could see the files (their names), but you couldn't touch them until they grew over a certain size. That last constraint suggested a good strategy, and the following iterations came up as an acceptable solution.
- Files are generated by the video exporter tool (on another server) and copied over the network to a specific directory (Export) on the backup server;
- A Windows script, triggered by the pre-plan action in CBL Backup, scans that folder and moves files bigger than a certain size (the threshold is controlled by the script) to another folder (Backup);
- The backup tool (CBL Backup) offloads the data to an S3 bucket and sets the object storage class to IA (Infrequent Access) to reduce the overall storage price (roughly half that of the Standard class);
- On successful upload, another script (triggered by the post-plan action) cleans up the source folder (Backup) and waits for the next run, where the carousel above starts its circle again.
Let's look at each of these steps in detail.
Windows script to move files larger than a certain size #
The script is based on a file size condition and performs one simple action: move files from Export to Backup. Let's create pre-plan.bat and place the following in it:
FOR /R "C:\POC\Export" %%F in (*) do if %%~zF geq 1000000000 move "%%F" "C:\POC\Backup"
We have C:\POC\Export, where the data arrives from the video software and the files gradually fill with content, and C:\POC\Backup, the folder that CloudBerry Backup uses as its backup source. The script walks over all files in the export folder, compares each against a ~1 GB threshold (1000000000 bytes), and moves those that are larger. It uses only a standard Windows command; the same can be done in PowerShell if you prefer.
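For reference, here is a minimal PowerShell sketch of the same logic. It assumes the same paths; note that PowerShell's 1GB literal equals 1073741824 bytes, slightly more than the 1000000000 used above:

# Move files of 1 GB or more from the export folder to the backup source folder
Get-ChildItem -Path "C:\POC\Export" -Recurse -File |
    Where-Object { $_.Length -ge 1GB } |
    Move-Item -Destination "C:\POC\Backup"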
Clean up script #
The next thing we need is a clean-up script that simply deletes all files in the folder (we could filter by size, but in practice we can delete them all) so the next iteration (backup) starts fresh. The overall plan is to offload everything to cloud storage and delete it locally.
So the script is super simple; let's create post-plan.bat and place the following in it:
cd /d C:\POC\Backup\
rem /q suppresses the "Are you sure (Y/N)?" prompt so the script can run unattended
del /q *
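If you went the PowerShell route above, the same cleanup (assuming the same folder) could be a one-liner:

# Delete everything under the backup source folder after a successful upload
Remove-Item -Path "C:\POC\Backup\*" -Recurse -Force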
Backup Pre/Post actions in action #
The final step is to configure the backup tool: add the two scripts in the backup plan wizard so it performs these custom actions (pre-plan.bat as the pre-action, post-plan.bat as the post-action).
Lastly, we need to consider the backup schedule. We set it to hourly, and in this particular case that worked fine!
That's really it! I haven't found a better way to do this, but if there is one - I'd be happy to hear about it!