Windows Autopilot is the modern way to provision computers, but hey… we’ve been provisioning computers for years. If it ain’t broke, why fix it? Well… there are a couple of reasons but the big one is this: Do you have any staff working from home? If you do, this is going to help.
I don’t know where you’re all at with this, so I’m going to start with an overview of the different ways new PCs are typically deployed in a business, and where Autopilot fits into that. Think about what needs to happen to get a computer from the state it’s in when it arrives in its box, to one an employee would actually use. It needs to get joined to the corporate network. In most cases that means joining it to an Active Directory domain so you can actually log in on it. It may also get device certificates installed, and remote access configured so you can use it out of the office.
You probably need some applications to be installed. Typically some flavour of Microsoft Office, and whatever apps you need for your industry and job role. Your company’s policies are going to be enforced. These will configure the operating system and applications to make it easier for you to get access to your stuff, as well as making sure your system is secure.
On that front, you may also have some auditing tool, remote support, and hopefully some form of antimalware installed and configured to report back to a central management platform. You’ll need the latest updates for all of that as well, and then you’re ready to go.
An administrator could set all of this up by hand, but that could easily take a couple of hours, and if they have lots of computers to deploy it will quickly become impractical. For this reason, companies will generally use an automated process such as an image or a task sequence.
For an image what they basically do is they set the first computer up by hand as I described, get it just the way they want it, then they run it through a process that strips away any identifying characteristics and takes a copy of its hard drive in this now-generic state. That hard drive copy is the image.
The same image can now be installed on tens, hundreds, or thousands of computers - blowing away whatever is there and replacing it with your standard build. Each computer will inherit all of the same applications and settings from the reference copy. When they’re first started up they’ll run through a setup process that recreates new identifying properties for that individual system. Then they get joined to the corporate network and you’re ready to go.
This saves a huge amount of time. Instead of spending a couple of hours per computer, you spend a couple of hours on the first one, and every subsequent device needs a couple of keypresses.
But images aren’t perfect. They’re great if every computer needs to be the same, but chances are you’ll find that different people need slightly different builds with different applications. OK, so you need a new image for each build. If any of the requirements change, you’ll need another new image. Then what about updates? The image is frozen in time, so as time progresses you’ll be deploying computers that are increasingly out of date – meaning more work after the image is installed. This typically means that for every image you’ve made, you’ll need to periodically rebuild it to produce a newer image with all of the current software updates.
A more flexible approach is to use a task sequence instead. This deploys a more minimal base image that is common to everyone, and applies the rest of your configuration as a series of automated installation steps after the initial image is complete.
If one team needs a different application, that’s fine. A slight tweak can be made to the task sequence to install that additional application on their computers. What about updates? It’s another step in the task sequence. This approach allows you to chop and change the desired build without needing a whole new image each time.
The most obvious downside to using a task sequence is that it’s slower. With an image, you deploy the image and you’re done. With a task sequence, you deploy the image, and then you still have to do everything else. The important point, though, is that this is automated, so whilst it might take quite a while to finish, you can just leave it to run. It’s not taking up anyone’s precious time.
There is another downside to task sequences, though. Complexity. With an image it pretty much works or it doesn’t. If the reference computer worked then the others should, too, because they’re essentially just duplicates. With a task sequence you’re customising the build process, and will end up with multiple different results from the same base image. This requires more skill to get right, and more testing to make sure it works.
I have on a number of occasions seen IT departments spend so much time managing their task sequences that it ends up being more laborious than the collection of dumb images they replaced.
Limitations of Images & Sequences
There is one limitation that both of these approaches share. Someone needs to do something. Even if you give your image to the manufacturer and have them apply it straight out of the factory, someone still needs to join that computer to the corporate network after it has been received, so that your employee can use it.
This is typically done by the IT department if not as part of the build process, because most companies don’t let anyone just join random devices to the network. This can quickly become a bottleneck because it tends to mean that regardless of where your staff are located, every device ends up getting shipped to the IT department first, so they can prepare it and send it on.
I have for many years found this process to be frustrating. The best way to explain why is to tell you a story. This is my experience as a recipient of a computer that had been prepared using a task sequence that hadn’t worked too well.
I received my new computer, and noticed a few issues. Firstly, my keyboard was outputting incorrect characters because it was set to American English instead of… English… English. Secondly, the touch input wasn’t working, because the required language pack was missing. Thirdly, some features were missing which turned out to be because Windows was a couple of versions behind, making it older than the device it had replaced. This was just the tip of the iceberg, I’m afraid.
I’m sure you’ll find this hard to believe, but sometimes… I get grumpy. When I have to spend more time fixing a computer that has been through its task sequence than it would have taken me to set it up by hand from scratch… I do get grumpy.
I gave the guy who built it a bit of a hard time. He gave me a sob story about how Microsoft had cocked up the firmware on the latest batch and the task sequence was crashing, and they barely had the current process working, and they didn’t have time to even attempt to get it working with the latest version of Windows.
How Autopilot Helps
His complaints were valid, but here was my point. The computer arrives with a current version of Windows on it. With all of the right drivers on it. With the right language packs installed. It was in a better state before it went through the task sequence! Wouldn’t it have been easier to just take it out of the box and join it to the network?
Of course they proceeded to tell me about all of the settings, the applications, the security requirements that need to be fulfilled before they can just hand it over… but wait a minute! Forget a new computer. How do we manage that for an existing computer?
It’s all automated. If software is missing, it’s either installed automatically or made available at the user’s discretion, depending on what it is. If the computer is found to be out of conformance with policy, those non-conformances are automatically remediated. It’s the same for updates. We’re not rebuilding every single computer every time an update comes out, are we? So, actually… if you just took that computer and joined it to the network as it came out of the box, it would in a short space of time be picked up as non-conformant and subsequently remediated. The missing software would be installed, too.

So if all of that day-to-day management is automated anyway as soon as the device is on the network, and the operating system, drivers, and language packs are configured and working out of the box… this is the gap. The critical step that actually needs to happen is joining it to the network – the domain. The rest of it is basically a nice-to-have. With the right automation in place for day-to-day management, if you complete that one task at build time you will end up with a working computer that conforms with your corporate policies.
That’s where Windows Autopilot slides in. Windows Autopilot bridges that gap. It automates that one step, allowing the rest of your automation to kick in, and take a computer fresh from the box it arrived in, to fully working and compliant; without you in the IT department having to do anything. I mean, yeah. You need to set up the management side, but you never need to touch the computer itself.
Here’s how it works. As an administrator you will configure your deployment profiles that define how computers are to be configured. When you buy your computers, you can have them delivered directly to your employees – wherever that may be. They might be in a different office, or at home. It doesn’t matter. When you buy the computer, the vendor also uploads a copy of their hardware IDs to Microsoft. Your employee opens the box and starts up their new computer – just as they’d do with a personal device. When it connects to the internet during the initial setup wizard it checks online to see if its hardware ID has been registered with Microsoft. If it has, then the computer downloads the appropriate deployment profile for your organisation, prompts the user for their corporate credentials, and automatically enrols itself into your management platform.
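If your vendor doesn’t upload the hardware IDs for you, you can harvest them yourself. Here’s a minimal sketch using the community Get-WindowsAutopilotInfo script from the PowerShell Gallery; it assumes an elevated PowerShell prompt on the device and, for the online option, credentials with Intune administrator rights:

```powershell
# Install the community script from the PowerShell Gallery
Install-Script -Name Get-WindowsAutopilotInfo -Force

# Capture this machine's hardware hash to a CSV, which an administrator
# can then import into the Intune admin centre to register the device
Get-WindowsAutopilotInfo -OutputFile .\AutopilotHWID.csv

# Or, with suitable admin credentials, register the device directly
# with your tenant in one step
Get-WindowsAutopilotInfo -Online
```

Once the device is registered and a deployment profile is assigned to it, the out-of-box experience takes over exactly as described above.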
It will be joined to your Azure AD, and can also be joined to an on-premises Active Directory infrastructure. You can use Intune or Endpoint Manager or whatever they call it today to deploy all of your settings and applications, or you can push out a VPN and let an on-premises system take over.
Autopilot is really that initial bootstrap program built into the Windows setup, that allows the device to connect to the cloud, discover that it belongs to your organisation, and get it over that initial hop from a generic Windows computer in a box to a device that is fully managed by your systems.
From that point, your existing management platform can kick in and do the rest. It can also be used to reset a device. So if the user is having problems with it, you can tell them to do a factory reset and the Autopilot process will kick back in – effectively allowing the computer to be rebuilt remotely. You can actually mix and match Autopilot with more traditional image-based deployments.
Autopilot’s main trick is using the cloud to convert a new device into a managed device, as a sort of self-service end-user experience rather than needing the IT department to do it or the user to be in a particular office. The fact that you can then deploy applications to it over the internet is a useful side effect. If you have a really heavy set of applications to deploy, this might not be ideal to do over the internet. In this case you can use a pre-provisioned deployment to combine Autopilot with a traditional imaging process.
How you might do this in practice is to create an image with all of your applications preinstalled, and have the vendor apply the image straight out of the factory. They ship it directly to the end-user, Autopilot kicks in and onboards it to become a managed device, but you don’t need to download all of the applications because they’re already installed.
In terms of licensing for this… Yeah, come on, you didn’t think it was free, did you? It’s Microsoft – there’s always a subscription somewhere.
It needs to be Windows 10 Pro or higher. It won’t work with Home Edition, but that would be pointless because if you’re using Home Edition then you can’t really manage it anyway. You will also need an Azure AD Premium subscription. Most common scenarios will require at least Enterprise Mobility + Security because a lot of the functionality is designed to be fulfilled by Intune. Endpoint Manager. Whatever they’re calling it when this goes live.
That’s not strictly necessary, though. If you have an Azure AD Premium subscription and a compatible third-party device management platform, you can use that instead. Of course all of these licences are packaged up in a number of Microsoft 365 subscriptions as well.
So that’s Windows Autopilot in a nutshell. It can be used with Intune as a lighter alternative to the likes of SCCM, and it can also coexist with and integrate with heavyweight platforms like SCCM. The big reason for using Autopilot is to get that zero-touch, remote provisioning. Yes, it’s an extra cost on top of Windows; but if you’ve bought into the Microsoft cloud ecosystem already, chances are… you may already be paying for it.
Question for you guys: are you using Autopilot? Has it made life easier for you? Have you come across any gotchas? Leave all that good info in the comments.