"BashFriday Chronicles: Backups to Azure Blob Storage with Style"

"BashFriday Chronicles: Backups to Azure Blob Storage with Style"

Table of contents

Are you ready for another exciting BashFriday adventure? This week, we've cooked up something special: a backup script that not only safeguards your precious Linux directories and files but also whisks them off to the cloud in Azure Blob Storage. So, without further ado, let's dive into the world of Linux backup wizardry!

Objective: In this thrilling episode of BashFriday, our goal is to create a robust backup mechanism that automatically backs up our Linux directories and files every midnight. The backed-up data will be stored securely in Azure Blob Storage. We want a fully automated process that takes care of everything while we sleep soundly.
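
Before we break the recipe down, here's the overall shape the script will take (a sketch only; each numbered step is filled in below):

    #!/bin/bash
    # cloud_backup.sh: high-level flow (each numbered step is detailed below)
    set -euo pipefail   # optional hardening: abort on errors and unset variables

    # 1. Build date-stamped src/dest directory names
    # 2. List files and directories sorted by modification time
    # 3. Copy directories (rsync) and files (cp) into the dest tree
    # 4. Upload the dest tree to Azure Blob Storage via the Azure CLI
    # 5. Clean up the local copies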

Ingredients:

  • Bash scripting skills

  • A Linux system

  • An Azure Blob Storage account

  • An Azure Blob Storage container (a quick CLI sketch for creating both follows this list)

  • A pinch of creativity and curiosity
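
Two of those ingredients, the storage account and the container, can be whipped up from the CLI if you don't have them yet. A minimal sketch (the resource group, account, and container names here are placeholders, not part of the project):

    # Assumes you've already signed in with: az login
    az group create --name backup-rg --location eastus
    az storage account create \
      --name mybackupstorage \
      --resource-group backup-rg \
      --location eastus \
      --sku Standard_LRS
    az storage container create \
      --name backups \
      --account-name mybackupstorage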

Recipe:

  • Create Date-Named Directories: First, we derive two directory names containing the current date. These directories, src<current_date> and dest<current_date>, will hold the bookkeeping list for the backup and the backed-up copies, respectively.

        current_date=$(date +"%d-%m-%Y")   # e.g. 03-11-2023
        src="src$current_date"             # e.g. src03-11-2023
        dest="dest$current_date"           # e.g. dest03-11-2023

  • Generate a List of Files and Directories: To decide what to back up, we create a list of files and directories sorted by their modification time. This is done using the ls -t command, and the list is stored in a file called files_and_dirs.txt.

        mkdir -p "master_backup_2023/$src"
        touch "master_backup_2023/$src/files_and_dirs.txt"
        # List files and directories sorted by modification time, skipping
        # the backup directory itself so we never back up our own backups
        ls -t | grep -vx 'master_backup_2023' > "master_backup_2023/$src/files_and_dirs.txt"
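
If your filenames contain spaces or other oddities, parsing ls output can be fragile. A hedged alternative using GNU find (assuming GNU findutils and coreutils are available) produces the same newest-first listing more robustly:

    # Print "mtime-epoch name" per entry, newest first, then strip the timestamp
    find . -maxdepth 1 -mindepth 1 -not -name 'master_backup_2023' -printf '%T@ %P\n' \
      | sort -rn | cut -d' ' -f2- \
      > "master_backup_2023/$src/files_and_dirs.txt"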
    
  • Categorize and Copy: Next, we sort through the list of files and directories. For directories, we use rsync to maintain the directory structure and copy them to dest<current_date>/dirs_backup. For individual files, we copy them to dest<current_date>/files_backup.

        # Make sure the destination directories exist before copying into them
        mkdir -p "master_backup_2023/$dest/dirs_backup" "master_backup_2023/$dest/files_backup"

        while IFS= read -r entry
        do
          if [ -d "$entry" ]; then
            # Copy directories; no trailing slash on the source, so rsync
            # recreates each directory itself and its structure is preserved
            rsync -av --exclude="files_and_dirs.txt" "$entry" "master_backup_2023/$dest/dirs_backup/"
          else
            # Copy individual files to the files_backup directory
            cp "$entry" "master_backup_2023/$dest/files_backup/"
          fi
        done < "master_backup_2023/$src/files_and_dirs.txt"
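
A quick rsync detail that matters in the loop above: a trailing slash on the source means "copy the directory's contents", while no trailing slash means "copy the directory itself". A tiny illustration with a hypothetical photos directory:

    rsync -av photos  backup/   # -> backup/photos/...  (top-level name kept)
    rsync -av photos/ backup/   # -> backup/...         (contents poured in, name lost)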
    
  • Azure CLI Magic: Now, here comes the real magic! We summon the Azure CLI to upload our precious backup to an Azure Blob Storage container. Using the az storage blob upload-batch command, we send our data securely to the cloud.

        # Define your Azure Blob Storage container and storage account information
        container_name="your_container_name"
        storage_account_name="your_storage_account_name"
        storage_account_key="your_storage_account_key"
        backup_dir="master_backup_2023"   # local root holding the dated directories

        az storage blob upload-batch \
          --destination "$container_name" \
          --source "$backup_dir/$dest" \
          --type block \
          --account-name "$storage_account_name" \
          --account-key "$storage_account_key"

        # Clean up the local backup
        rm -r "$backup_dir/$src"
        rm -r "$backup_dir/$dest"
    
  • Clean-Up Time: Once our data is safely tucked away in Azure Blob Storage, we perform some housekeeping. We remove the temporary directories and files to keep our system tidy and our disk space free.
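
One refinement worth considering (our own suggestion, not part of the original script): make the clean-up conditional on the upload succeeding, so a failed upload never costs you the local copy. A sketch:

    if az storage blob upload-batch \
         --destination "$container_name" \
         --source "$backup_dir/$dest" \
         --type block \
         --account-name "$storage_account_name" \
         --account-key "$storage_account_key"; then
      # Upload succeeded; it's safe to reclaim the local disk space
      rm -r "$backup_dir/$src" "$backup_dir/$dest"
    else
      echo "Upload failed; keeping the local backup for a retry" >&2
    fi

And if you'd rather not keep the account key in the script at all, az storage blob upload-batch also accepts --auth-mode login after an az login session with a role that can write blobs.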

Serve with a Side of Automation:

In true BashFriday spirit, our backup script is fully automated. It springs into action every midnight to create backups without us lifting a finger. With rsync preserving directory structure locally and upload-batch reproducing the folder layout in blob names, that structure survives the journey to the Azure cloud.

crontab -e
0 0 * * * /path/to/your/cloud_backup.sh
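
To keep an eye on the nightly runs, you can also have cron append the script's output to a log file (the log path is just an illustration):

    0 0 * * * /path/to/your/cloud_backup.sh >> /var/log/cloud_backup.log 2>&1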

Project Structure

BashFriday Project (root directory)
├── cloud_backup.sh (the main backup script)
└── master_backup_2023 (the main backup directory)
    ├── src<current_date> (source directory with the current date)
    │   └── files_and_dirs.txt (list of files and directories)
    └── dest<current_date> (destination directory with the current date)
        ├── dirs_backup (backups of directories)
        └── files_backup (backups of individual files)

Why Azure Blob Storage?

Azure Blob Storage is our cloud sanctuary for backup. It provides scalability, durability, and cost-effectiveness for storing our critical data. Plus, Azure's global reach means our backups are never far away when we need them.

Conclusion: This week's BashFriday project has equipped us with a supercharged backup mechanism. It not only safeguards our Linux directories and files but also harnesses the power of the Azure cloud for secure, off-site storage. As we sip our morning coffee, our backup script works tirelessly to ensure our data is safe and sound.

View it on GitHub here

With our backup script, we've taken a leap forward in ensuring the integrity and availability of our data. The combination of Bash scripting and cloud technology has opened up a world of possibilities for our BashFriday adventures.

Remember, in the world of BashFriday, every week is a new adventure. Stay curious, stay creative, and keep those Bash skills sharp!

Happy BashFriday, fellow script adventurers! 🚀🌟

Cheers 😃👏.