On some occasions, it’s useful to be able to make a full backup of a system on an ad-hoc basis. One example would be making a complete backup of a Mac’s boot drive before sending it in to Apple for a repair, as Apple may swap out or erase the Mac’s existing boot drive as part of the repair process if their tools indicate a drive problem.
When I’ve needed to do this, I’ve used DeployStudio, because DeployStudio includes the ability to do the following:
- Create an asr-ready disk image from a Mac’s boot drive containing the OS and all other data.
- Restore the disk image to an available volume on the same or a different Mac, and set the target volume to be bootable.
These capabilities were originally designed to allow monolithic images to be created from one Mac for distribution to other Macs, but they also allow DeployStudio to create on-demand backups of a Mac’s boot drive. For more details, see below the jump.
I don’t normally try to foretell the future but there is one change for Mac admins that I’m pretty sure will happen:
The coming of Apple File System (APFS) will mark the end of disk imaging on Macs.
For those not familiar with disk imaging, a disk image is a computer file containing the contents and structure of a disk volume. Mac disk images are applied to hard drives using the Apple Software Restore (asr) command line utility to erase the destination drive and then block-copy the data from the disk image onto the destination drive.
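As a hedged illustration of what an asr restore looks like (the image path and volume name below are placeholders, not from any real deployment), the command is echoed here as a dry run:

```shell
#!/bin/sh
# Hypothetical example of restoring a disk image with asr.
# The image path and target volume name below are placeholders.
IMAGE="/Users/Shared/boot_backup.dmg"
TARGET="/Volumes/Macintosh HD"

# asr erases the target volume, then block-copies the image onto it.
# The command is only echoed here; run it with sudo to restore for real.
CMD="asr restore --source $IMAGE --target $TARGET --erase --noprompt"
echo "Would run: sudo $CMD"
```

The `--erase` flag is what makes this a destructive block-copy rather than a file copy, which is why asr-based imaging is so fast.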
Mac deployment practices have generally fallen into one of three categories:
Monolithic imaging is the practice of building a Mac with the desired operating system, software, and configuration settings, then creating a disk image of the entire contents of that Mac’s boot drive: the operating system, the installed software, and the settings.
Once that disk image is created, the image is then applied to multiple other Macs to make them just like the original Mac.
Modular imaging is the practice of creating a disk image that contains only the base OS (as well as necessary OS updates from Apple).
Once that disk image is created, the image is applied to multiple other Macs. Desired software and desired configuration settings are then installed onto the newly-imaged Mac as post-imaging deployment tasks.
Thin imaging is technically not an imaging practice, as no disk image is involved. Instead, the assumption is that Macs from Apple come with a pre-installed OS and that OS should be used instead of wiping it and replacing it with a new copy from a disk image.
In this scenario, a deployment workflow is run which installs the desired software and desired configuration settings onto the Mac. If a Mac needs to be wiped and re-setup, a fresh copy of the OS is installed via the Recovery environment or similar OS installation process and then the thin imaging deployment workflow is re-run.
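The deployment step of a thin imaging workflow can be sketched roughly as follows; this is not any particular tool's implementation, and the package directory path is hypothetical. The installer commands are echoed rather than run:

```shell
#!/bin/sh
# Hypothetical thin-imaging bootstrap: install every queued package
# onto the currently booted OS. The directory path is a placeholder.
PKG_DIR="/Library/Deploy/Packages"
INSTALLED=0

for pkg in "$PKG_DIR"/*.pkg; do
  [ -e "$pkg" ] || continue
  # Apple's installer tool applies a package to the booted volume;
  # remove the leading "echo" (and run as root) to install for real.
  echo installer -pkg "$pkg" -target /
  INSTALLED=$((INSTALLED + 1))
done
echo "Queued $INSTALLED package(s) for installation."
```

Because everything is applied to the OS already on the drive, no disk image (and no asr) is involved at any point.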
Imaging using asr has been around for a long time (I first began using it back in the Mac OS X 10.2.x days) but there have been strong hints that those days are coming to an end. The most visible of these was this tweet from the makers of DeployStudio:
While the makers of DeployStudio don’t speak for Apple, a statement like this matches up with what I’ve heard from other Mac admins who have independently received similar messages as part of their communication with Apple. Apple hasn’t commented publicly one way or the other, so unfortunately I can’t be more specific than that.
If imaging isn’t available, what are the alternatives? Apple has been encouraging the use of Apple’s Device Enrollment Program (DEP), which leverages a company’s, school’s, or institution’s mobile device management (MDM) service. In this case, you would need to arrange with Apple or an Apple reseller to purchase Macs that are enrolled in your organization’s DEP.
When a DEP-enrolled Mac is started for the first time (or started after an OS reinstall), it is automatically configured to use your organization’s MDM service, and the device checks in with that service. The MDM service then configures the Mac as desired with your organization’s software and configuration settings. A good example of what this process may look like can be seen here.
What if you don’t have DEP, or you don’t have MDM? In that case, you may still be able to leverage a thin imaging deployment workflow, which installs the desired software and desired configuration settings onto the Mac’s existing OS. To get an existing OS though, you would need to install it via the Recovery environment or a similar OS installation process.
Planning for the future
Today, imaging works and our deployment workflows are what they are. What should be done to prepare for the future?
If you’re already using DEP with MDM to set up your Macs:
- Congratulations! You’re good to go with an Apple-supported deployment workflow that should work fine for the foreseeable future.
If you’re not using DEP with MDM to set up your Macs:
- If DEP is an option for your organization and you have an existing MDM service, investigate using Apple’s DEP service to set up your Macs for deployment. You may find that DEP doesn’t work for you in its current form, but now is the time to find that out and work with Apple to get those parts fixed.
- If DEP isn’t an option for your organization (because you aren’t using MDM and/or you aren’t in a country where DEP is supported) and you aren’t using a thin imaging deployment workflow now, I recommend investing the time and effort to start using a thin imaging workflow. In particular, if you are using monolithic imaging to set up your Macs, it is time to stop and transition to an alternate way of deploying Macs before that imaging method abruptly stops working.
When will we know how long imaging has left? My recommendation is to watch what Apple reveals at this summer’s WWDC 2017 conference and pay particular attention to any device management or APFS developments that are announced, as those announcements will likely provide the best information.
As part of starting my new position, I’m transitioning from a job where I’m going to work at an office to a work-from-home position. This has a number of personal benefits for me, but I also knew that I was going to need an office to work out of. Working from my dining room table, or from the sofa, was going to be problematic for me for the following reasons:
- I need a transition between work and home – I knew that if I worked from inside my house, I was not going to be able to easily do the mental switch from “I’m at work” to “I’m at home”. “Home” and “Work” were inevitably going to blur into some mishmash that I mentally dubbed “Hork”. That did not sound pleasant, either for me or for my family.
- I need quiet – Like a number of homes, mine is occasionally very noisy. This isn’t necessarily a bad thing, that’s just the way it is. There were very likely going to be numerous occasions when I needed quiet while working, and what was happening in my home was not going to be quiet at all.
- I need room for work-related stuff – Where I worked was also going to be where I was going to use and store my work-related gear. For my own peace of mind, I didn’t want to store my work-related equipment near where my pets and younger members of the family would have access to them.
- I need to set up work-related equipment on a permanent basis – For various reasons, I like working on a desktop workstation, with attached displays, keyboard and mouse. I also like not having to constantly disassemble and re-assemble my desktop and its attached peripherals, which means I need a place where I can set them up and trust that they’ll be able to stay there on a long-term basis.
With all of those needs in mind, I decided to go the route of having a purpose-built office constructed for my work needs. For more details, see below the jump.
At the beginning of November, I made the following announcement via Twitter:
Time has marched on and today is my last day at the Howard Hughes Medical Institute’s Janelia Research Campus. I wanted to take the opportunity to express my gratitude to the good folks who work there and to my management in particular.
Since 2011, I’ve spoken at a number of Mac IT conferences on a variety of topics. The ability to do so wouldn’t have been possible without the generous support that I received from HHMI. I’ve also had complete freedom when it comes to the writing I’ve done on this blog and elsewhere, which has been a huge boon to me both professionally and personally.
I look forward to continuing to both speak and write as part of my move to SAP, where I’m joining a great team doing amazing things. However, I’ll never forget that it was HHMI’s unstinting support that made it possible to begin with. Thank you.
One of the practices that has historically helped Macs fit better into enterprise environments has been to bind Macs to Active Directory (AD) domains and use AD mobile accounts, using either Apple’s own AD directory service plug-in or a third-party product like Centrify. However, this practice means that the password for the mobile account is controlled by a service located outside of the AD-bound Mac, which has led to problems with keeping the account’s keychain and FileVault 2 passwords synchronized with the AD password.
With the recent availability of tools like Apple’s Enterprise Connect and NoMAD, it’s now possible to provide the advantages of being connected to Active Directory to your Mac without actually having to bind your Mac to an AD domain. This has led to more environments not binding their Macs to AD and using either Enterprise Connect or NoMAD with local accounts.
With local accounts, all password management is done on the individual Mac. This means that problems with keychain and FileVault password synchronization are vastly reduced because the password change mechanism for a local account includes updating both the keychain and FileVault 2 automatically with the new authentication credentials.
For those shops that have been binding their Macs and using mobile accounts, but want to switch to the new local accounts + Enterprise Connect / NoMAD model, there is an account-related challenge to overcome:
How do you transition from an AD mobile account, where the password is managed by AD, to a local account, where the password is managed by the individual Mac, with the least amount of disruption for your users?
To assist with this process, I’ve developed a script that can take an existing AD mobile account and migrate it to being a local account with the same username, password, UID, and GID. For more details, see below the jump.
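This is not the script itself, but the core idea can be sketched as follows: delete the cached Active Directory attributes from the user record with dscl, leaving behind a local account with the same name, UID, GID, and password. The username and the attribute list below are illustrative (the exact attribute set varies by OS version), and the commands are echoed rather than run:

```shell
#!/bin/sh
# Hypothetical sketch of converting an AD mobile account to a local one.
# USERNAME is a placeholder; the attribute list is illustrative and may
# not be complete for every macOS version.
USERNAME="jappleseed"
REMOVED=""

# Mobile accounts carry cached AD metadata; deleting these attributes
# from the local directory node leaves a plain local account behind.
for attr in cached_groups cached_auth_policy CopyTimestamp \
            OriginalAuthenticationAuthority OriginalNodeName \
            SMBSID SMBPasswordLastSet PrimaryNTDomain \
            AppleMetaRecordName MCXSettings MCXFlags; do
  # Remove the leading "echo" (and run as root) to actually delete.
  echo dscl . -delete "/Users/$USERNAME" "$attr"
  REMOVED="$REMOVED $attr"
done
echo "Attributes queued for removal:$REMOVED"
```

Since the account's password hash already lives on the Mac, the user keeps the same password, and keychain and FileVault 2 credentials stay untouched.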
Providing new installs of macOS, or upgrading to newer versions, can be a challenge in many Mac environments. Apple’s OS distribution model is focused around the Mac App Store (MAS), which may not be an option for a number of managed Mac environments. The MAS-distributed OS installer also does not include the option of adding additional third-party packages to the OS installation process; it only installs the software that Apple itself includes in the OS installer.
To address these needs, an open-source tool named createOSXinstallPkg is available. createOSXinstallPkg allows you to create an Apple installer package from an “Install macOS.app”. You can use this package for the following:
- Installing OS X or macOS onto an empty drive
- Upgrading existing OS X or macOS installations to a newer version of the operating system
The advantage of using this tool is that a number of Mac system deployment tools can deploy the installer packages it creates, allowing OS installations or upgrades to be performed by whatever system management tool a particular IT shop already uses. createOSXinstallPkg can create an installer package that installs a stock copy of OS X or macOS, or you can add additional packages to the stock OS install.
When adding packages, there are a couple of guidelines to keep in mind:
- There is about 350 megabytes of free space available in the OS installer. This is sufficient space for configuration or bootstrapping packages, but it’s not a good idea to add Microsoft Office or similar large installers.
- The limitations of the OS install environment mean that there are a number of installers that won’t install correctly.
In particular, packages that use pre-installation or post-installation scripts may fail to run properly when those packages are run as part of the OS installation process. To help work around this limitation, I’ve developed a solution which I’ll be discussing later in the post. For more details, see below the jump.
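As a rough illustration of building such a package (the installer app and extra package paths are placeholders; check the tool's own documentation for the full option list), an invocation with one added bootstrap package looks something like this, echoed as a dry run:

```shell
#!/bin/sh
# Hypothetical invocation of createOSXinstallPkg; paths are placeholders.
SOURCE="/Applications/Install macOS Sierra.app"
EXTRA_PKG="/Users/Shared/bootstrap.pkg"

# --source points at the OS installer app; --pkg adds a third-party
# package to the OS install. Echoed here rather than executed.
CMD="./createOSXinstallPkg --source \"$SOURCE\" --pkg \"$EXTRA_PKG\""
echo "Would run: sudo $CMD"
```

Keeping the added package small (a bootstrapping payload rather than a full application suite) stays within the roughly 350 MB of free space mentioned above.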
In my shop, we’re not currently using Apple’s VPP program for purchasing applications from the Mac App Store (MAS). However, we do want to make it convenient for our users to be able to access and install some commonly used applications which are available from the App Store. Casper 9.4 and later natively supports providing access to MAS applications, but this approach is more focused on VPP-purchased applications. In my shop’s case, our customers are more likely to purchase apps from the MAS using Apple’s consumer payment model and then get reimbursed.
To help with this, I originally used a process similar to this one developed by Bryson Tyrrell. I wanted to make the process more modular, though, where I only needed to supply a URL from the MAS and have a scripted solution handle the rest. For more details, see below the jump.
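The first step in any such scripted solution is pulling the app's numeric ID out of the MAS URL; a minimal sketch of that step (the Xcode URL below is just an example input, and this is not the full solution) might use sed:

```shell
#!/bin/sh
# Minimal sketch: pull the numeric app ID out of a Mac App Store URL.
# The Xcode URL below is only an example input.
MAS_URL="https://itunes.apple.com/us/app/xcode/id497799835?mt=12"

# Strip everything up to the final "/id", then drop any trailing
# query string or path component.
APP_ID=$(printf '%s\n' "$MAS_URL" | sed -e 's|.*/id||' -e 's|[?/].*||')
echo "App ID: $APP_ID"
```

With the ID in hand, the rest of the workflow (building a policy or a download link around that app) can be automated around it.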