
Backing up the contents of an AWS-hosted Jamf Pro cloud distribution point to a local directory

As part of removing unused packages from a Jamf Pro cloud distribution point using @shea_craig’s Spruce tool, I needed to first back up the contents of the cloud distribution point to a local directory on my Mac. That way, if I made an error and deleted the wrong installer package, I would have a copy of the package readily available and could re-add it to my Jamf Pro server.

The cloud distribution point in question is hosted in Amazon Web Services’ (AWS) S3 service, so I decided to use AWS’s awscli command-line tool’s S3 functions to run a one-way synchronization between the cloud distribution point in S3 and my local directory. For more details, please see below the jump.

My first step was installing and configuring the awscli tool. Once I had the awscli tool installed and configured with the appropriate AWS access key, AWS secret key and AWS region, I ran the following command:

aws s3 ls --summarize --human-readable --recursive s3://S3-bucket-name-goes-here/

The aws s3 ls command shown above allows me to do the following:

  1. Verify that I can access the S3 bucket which my Jamf Pro server is using as a cloud distribution point.
  2. List the contents of the bucket.
  3. Summarize the total number of objects and the total size of all objects in the bucket.
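For reference, the credential setup mentioned earlier can also be scripted rather than done interactively. A sketch using awscli’s `aws configure set` subcommand, with placeholder values standing in for real credentials:

```shell
# Non-interactive awscli configuration.
# All three values below are placeholders - substitute your own
# AWS access key, secret key, and region.
aws configure set aws_access_key_id "ACCESS-KEY-ID-GOES-HERE"
aws configure set aws_secret_access_key "SECRET-KEY-GOES-HERE"
aws configure set region "us-east-1"
```

This writes the values to the default profile in ~/.aws/credentials and ~/.aws/config, the same files the interactive `aws configure` command populates.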

To run the backup, I’m using the following aws s3 sync command:

aws s3 sync --delete s3://S3-bucket-name-goes-here/ /path/to/backup/directory
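Because the `--delete` option removes files from the local directory that no longer exist in the bucket, it can be worth previewing the operation before running it for real. awscli supports a `--dryrun` flag for this, which prints the copy and delete operations without performing them (the bucket name and path below are the same placeholders as above):

```shell
# Preview the sync: prints what would be copied and deleted,
# but makes no changes to the bucket or the local directory.
aws s3 sync --delete --dryrun s3://S3-bucket-name-goes-here/ /path/to/backup/directory
```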

To help automate the sync process, I’ve written a script which does the following:

  1. If necessary, creates a log file named dp_sync.log and stores it in /var/log/.
  2. If necessary, creates the local directory to store the files downloaded from the cloud distribution point.
  3. Performs a one-way synchronization of the cloud distribution point with the local directory, so that the local backup directory contains only the contents of the specified S3 bucket.
  4. Sets all downloaded files to be world-readable.
  5. Logs all actions to /var/log/dp_sync.log.
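The steps above can be sketched as follows. This is a minimal sketch of the logic, not the actual script: the bucket name and backup path are placeholders, and the sketch defaults to logging under /tmp so it can run without root (the real script logs to /var/log/dp_sync.log).

```shell
#!/bin/bash

# Sketch of the sync logic. Placeholder values throughout; the sketch
# defaults to /tmp paths so it can run without root privileges.
S3_BUCKET="${S3_BUCKET:-s3://S3-bucket-name-goes-here/}"
BACKUP_DIR="${BACKUP_DIR:-/tmp/dp_backup}"
LOG_FILE="${LOG_FILE:-/tmp/dp_sync.log}"

# 1. Create the log file if it doesn't already exist.
[[ -f "$LOG_FILE" ]] || touch "$LOG_FILE"

# Log a timestamped message to both stdout and the log file.
log() {
    echo "$(date +"%F %T"): $*" | tee -a "$LOG_FILE"
}

# 2. Create the local backup directory if it doesn't already exist.
[[ -d "$BACKUP_DIR" ]] || mkdir -p "$BACKUP_DIR"

# 3. One-way sync: make the local directory mirror the S3 bucket.
if command -v aws >/dev/null 2>&1; then
    log "Starting sync of $S3_BUCKET to $BACKUP_DIR"
    aws s3 sync --delete "$S3_BUCKET" "$BACKUP_DIR" >> "$LOG_FILE" 2>&1
    # 4. Make all downloaded files world-readable.
    chmod -R a+r "$BACKUP_DIR"
    log "Sync finished."
else
    log "awscli not found; skipping sync."
fi
```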

The script is also available on GitHub via the link below:

https://github.com/rtrouton/aws_scripts/tree/master/s3_bucket_backup_to_local_directory
