
Backing up the contents of an AWS-hosted Jamf Pro cloud distribution point to a local directory

As part of removing unused packages from a Jamf Pro cloud distribution point using @shea_craig’s Spruce tool, I needed to first make a backup of the contents of the cloud distribution point to a local directory on my Mac. That way, in case I had made an error and deleted the wrong installer package, I had a copy of the package readily available and could re-add the package back to my Jamf Pro server.

The cloud distribution point in question is hosted in Amazon Web Services’ (AWS) S3 service, so I decided to use AWS’s awscli command line tool’s S3 functions to run a one-way synchronization process between the cloud distribution point in S3 and my local directory. For more details, please see below the jump.

My first step was installing and configuring the awscli tool. Once I had the awscli tool installed and configured with the appropriate AWS access key, AWS secret key and AWS region, I ran the following command:
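For reference, the awscli tool can be configured interactively with the `aws configure` command, which prompts for each value and stores them in `~/.aws/credentials` and `~/.aws/config`. The key values and region shown below are placeholders, not real credentials:

```shell
# Interactive one-time setup for the awscli tool.
# Stores credentials in ~/.aws/credentials and the region
# in ~/.aws/config. All values shown are placeholders.
aws configure
# AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
# AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
# Default region name [None]: us-east-1
# Default output format [None]: json
```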

aws s3 ls --summarize --human-readable --recursive s3://S3-bucket-name-goes-here/

The aws s3 ls command shown above allows me to do the following:

  1. Verify that I can access the S3 bucket which my Jamf Pro server is using as a cloud distribution point.
  2. List the contents of the bucket.
  3. Summarize the total number of objects and the total size of all objects in the bucket.


computername:~ username$ aws s3 ls --summarize --human-readable --recursive s3://jamf96d3041a98a94a6099c271c860243afe/
2017-12-17 02:03:59 3.7 MiB Microsoft_AutoUpdate_3.14.17121501.pkg
2018-01-24 02:02:43 3.7 MiB Microsoft_AutoUpdate_3.15.18011602.pkg
2018-02-14 02:18:42 779.0 MiB Microsoft_Excel_16.10.18021001.pkg
2018-01-19 02:23:57 768.9 MiB Microsoft_Excel_16.9.18011602.pkg
2017-05-23 16:30:29 38.5 MiB Microsoft_Lync_14.4.3.pkg
2018-02-14 02:14:53 1.6 GiB Microsoft_Office_16.10.0.pkg
2018-01-19 02:19:08 1.6 GiB Microsoft_Office_16.9.0.pkg
2017-12-13 02:35:06 30.4 MiB Microsoft_OneDrive_17.3.7131.pkg
2018-02-02 02:15:11 30.5 MiB Microsoft_OneDrive_17.3.7294.pkg
2018-02-14 02:26:22 405.9 MiB Microsoft_OneNote_16.10.18021001.pkg
2018-01-19 02:32:09 401.0 MiB Microsoft_OneNote_16.9.18011602.pkg
2018-02-14 02:24:12 870.0 MiB Microsoft_Outlook_16.10.18021001.pkg
2018-01-19 02:29:57 850.2 MiB Microsoft_Outlook_16.9.18011602.pkg
2018-02-14 02:31:00 711.3 MiB Microsoft_PowerPoint_16.10.18021001.pkg
2018-01-19 02:36:46 704.2 MiB Microsoft_PowerPoint_16.9.18011602.pkg
2018-01-27 02:21:41 704.2 MiB Microsoft_PowerPoint_16.9.18012504.pkg
2017-05-23 16:31:18 657.9 KiB Microsoft_SharePoint_2016_15.32.17030901_SharePointPlugin.pkg
2017-11-20 01:58:46 657.9 KiB Microsoft_SharePoint_Plugin_15.32.17030901.pkg
2017-11-10 02:13:43 30.4 MiB Microsoft_Skype_For_Business_16.12.77.pkg
2017-12-09 02:14:27 30.5 MiB Microsoft_Skype_For_Business_16.13.184.pkg
2018-02-01 02:16:30 30.5 MiB Microsoft_Skype_For_Business_16.14.163.pkg
Total Objects: 21
Total Size: 10.5 GiB
computername:~ username$
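If a package does turn out to be needed after it has been deleted from the bucket, a single file can be copied back from the local backup using the `aws s3 cp` command. The package name below is taken from the listing above; the bucket name and local path are placeholders:

```shell
# Copy one package from the local backup directory back up to
# the S3 bucket. Bucket name and local path are placeholders.
aws s3 cp "/path/to/backup/directory/Microsoft_AutoUpdate_3.15.18011602.pkg" s3://S3-bucket-name-goes-here/
```

Note that a package copied back into the bucket this way may also need its package record re-created in Jamf Pro before it can be deployed again.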


To run the backup, I’m using the following aws s3 sync command:

aws s3 sync --delete s3://S3-bucket-name-goes-here/ /path/to/backup/directory
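Because the `--delete` option removes local files which no longer exist in the bucket, it can be worth previewing the operation first. The `aws s3 sync` command supports a `--dryrun` flag which prints the actions that would be taken without performing them:

```shell
# Preview the sync: print what would be downloaded or deleted
# without actually making any changes. Bucket name and local
# directory path are placeholders.
aws s3 sync --delete --dryrun s3://S3-bucket-name-goes-here/ /path/to/backup/directory
```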

To help automate the sync process, I’ve written a script which does the following:

  1. If necessary, creates a log file named dp_sync.log and stores it in /var/log/.
  2. If necessary, creates the local directory used to store the files downloaded from the cloud distribution point.
  3. Performs a one-way synchronization of the cloud distribution point with the local directory, so that the local backup directory only contains the contents of the specified S3 bucket.
  4. Sets all downloaded files to be world-readable.
  5. Logs all actions to /var/log/dp_sync.log.

The script is also available on GitHub via the link below:

https://github.com/rtrouton/aws_scripts/tree/master/s3_bucket_backup_to_local_directory


#!/bin/bash
# Script backs up the contents of an S3 bucket hosted
# in Amazon Web Services to a local directory.
#
# Pre-requisites:
# Script assumes that the AWS CLI tool has been installed
# and configured with the following:
#
# AWS access key
# AWS secret key
# Correct AWS region for the S3 bucket specified below.
# Specify the name of the S3 bucket
s3_bucket="S3-bucket-name-goes-here"
# Location of the local directory used
# to store the downloaded files.
local_directory="/path/to/backup/directory"
# Set up logging format
log_location="/var/log/dp_sync.log"
ScriptLogging(){
    DATE=$(date +%Y-%m-%d\ %H:%M:%S)
    LOG="$log_location"
    echo "$DATE" " $1" >> "$LOG"
}
# Verify that /var/log/dp_sync.log is present and create it
# if /var/log/dp_sync.log is not present.
if [[ ! -f "$log_location" ]]; then
touch "$log_location"
fi
# Verify that the local backup directory is present
# and create it if is not present.
if [[ ! -d "$local_directory" ]]; then
mkdir -p "$local_directory"
fi
# Perform a one-way synchronization from the S3 bucket to the local backup directory, so that the
# local backup directory only contains the contents of the specified S3 bucket.
#
# Once the S3 bucket's contents have been synchronized, all files and directories are set to be
# world-readable. All output should be logged to /var/log/dp_sync.log.
ScriptLogging "Starting synchronization"
aws s3 sync --delete s3://"$s3_bucket"/ "$local_directory" >> "$log_location" 2>&1
chmod -R 755 "$local_directory"
ScriptLogging "Synchronization complete"
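To run the backup on a schedule, the script could be triggered from cron or a LaunchDaemon. Since the script writes to /var/log/dp_sync.log, it needs to run with root privileges. A crontab entry along these lines (the script path and timing are placeholders) would run it nightly:

```shell
# Example entry for root's crontab: run the backup script
# every night at 2:00 AM. The script path is a placeholder.
0 2 * * * /usr/local/bin/s3_bucket_backup_to_local_directory.sh
```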
