Ignite 2018 – My wrap up

First of all, what an amazing experience to attend Microsoft Ignite 2018 in Orlando. It all started off with a keynote by Satya Nadella, followed by general announcement sessions and technical deep-dive sessions. The key message was about “Tech Intensity”: changing your cultural mindset and your processes, ultimately leading to a better digital transformation by tearing down silos in your organization, bringing teams and technology closer together, and creating close feedback loops to take advantage of valuable insights. Technology will eventually touch every aspect of our lives, and now is the time to align yourself to this. Time to disrupt yourself and modernize your own business model. Everyone has to find ways to leverage technology to optimize their business; otherwise others will gain a significant advantage in a short amount of time. It’s time for a culture of adoption.

The biggest announcement for me as an Enterprise Mobility MVP is by far the Win32 app support in Microsoft Intune.

IntuneWin32Announcement

Microsoft delivers the functionality to wrap your apps (all kinds of installers, like setup.exe etc.) in a zip-based container format (.intunewin) and allows distribution via Intune. This is a game changer for Microsoft Intune, as this was one of the missing parts and a blocker for customers to fully adopt the modern management approach. Customers who are willing to modernize and are going for a full modern management approach are now able to install their line-of-business applications directly via Intune. This functionality is powered by the Intune Management Extension (see my deep dives Part 1 and Part 2), which was already used for PowerShell script execution with Intune in the past. I will provide some technical deep-dive insights into the Win32 capabilities in the following days.
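The wrapping itself is done with the Win32 Content Prep Tool (IntuneWinAppUtil.exe). A minimal sketch, where the paths and the installer name are placeholders for illustration:

```powershell
# Wrap a classic installer into an .intunewin package for upload to Intune.
#   -c  source folder containing the installer and all its files
#   -s  the setup file itself
#   -o  output folder for the generated .intunewin package
.\IntuneWinAppUtil.exe -c "C:\Packages\MyApp" -s "setup.exe" -o "C:\Packages\Output"
```

The resulting .intunewin file is what gets uploaded as a Win32 app in the Intune portal.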

For Windows Autopilot they announced Hybrid Azure AD join and the capability to harvest Autopilot data. The Hybrid Azure AD join might be good for some people, but I really encourage everyone to check out Azure AD join, as it also supports accessing local AD resources with Kerberos authentication. Harvesting Autopilot information of existing devices in an easy way is great and simplifies the steps to a modern infrastructure, where all devices, even in a reinstall/reset scenario, use a simplified setup powered by Windows Autopilot. For Intune they delivered great sessions regarding the availability of Intune Security Baselines and Desktop Analytics. The Intune Security Baselines are provided to fill configuration gaps which companies currently see when transitioning from a local AD and Group Policy environment to an Azure AD/MDM environment. The goal is to have all required security settings available in the MDM environment to make the transition easy. In addition to the Security Baseline settings, Intune gets support for Administrative Templates to further simplify the configuration of policy settings for Office and Windows. It frees us from the complex process of ADMX ingestion by providing the most-needed settings in the portal. Not only is settings administration made simpler; they also released the Intune PowerShell SDK to easily automate tasks within Intune. With Desktop Analytics we get tighter integration of the telemetry data from Windows Analytics into Intune to build piloting rings based on telemetry, ensuring good coverage of your LOB apps in the piloting rings.

A lot of changes are happening in the IT world right now, and Microsoft is building out its platforms to be even more mature and support us in every aspect. This starts with the Microsoft 365 story, bringing real cross-platform functionality like Information Protection availability in every product (Outlook, Word, etc.) on Windows, iOS/macOS, and Android. They also announced the OneDrive feature “Files on Demand” for macOS. There is a very big focus on security in everything Microsoft is doing right now. More than 3,500 full-time security professionals are working at Microsoft to power the security platforms, also with the help of AI to generate insights for the Intelligent Security Graph. To fight the world of passwords they announced Azure AD passwordless sign-in. A very welcome simplification of the portals is also coming up to provide consistent handling. This simplification includes a common URL scheme like devicemanagement.microsoft.com, admin.microsoft.com, security.microsoft.com, or compliance.microsoft.com. Even a Windows Virtual Desktop in the cloud was announced. Another highlight to mention is the new offer called Microsoft Managed Desktop, where Microsoft will completely manage your devices. In the Office world we got Microsoft Ideas, which helps you find great layouts and even the interesting data in your Excel spreadsheet, powered by AI.

Besides these great announcements I took the chance to meet with a lot of the product group members and had great conversations with them. I also met a lot of my fellow MVPs, which was a great experience. Really looking forward to the next one in Orlando on November 4-8. If you’d like to pre-register, follow this link.

See you in Orlando next time! 👍

 

 

Automation of gathering and importing Windows Autopilot information

Complete process automation of gathering and uploading a device’s Autopilot information to the Windows Autopilot service with an Azure Automation Runbook.

AzureAutomation
In one of my previous blog posts, Gather Windows 10 Autopilot info in azure blob storage during wipe and reload, I described the gathering of Autopilot information during operating system deployment in a wipe-and-reload scenario with MDT. Just a short recap of the problem and my initial solution:

If we purchase a new device, the OEM vendor takes care of installing Windows 10 as a signature edition or provisioning-ready installation including all necessary drivers. If we buy new hardware, the Autopilot information can be synced into our tenant by the OEM vendor (Lenovo is already capable of doing that, and others will follow). We get the device information in Intune, and we can start to assign an Autopilot deployment profile and enroll the devices.

What if we have a bunch of Windows 7 devices in the environment?

A way to handle this is to play the role of the OEM vendor ourselves: install a Windows 10 signature edition on the existing Windows 7 devices, gather the Autopilot information, and let Windows 10 start in the Out-of-Box Experience (OOBE) again for user enrollment. Depending on what is available, we can use ConfigMgr or MDT for this. My example uses MDT.

Now imagine a situation where a rollout team is preparing a lot of machines. We would end up with a lot of .csv files on different devices. To make it a little easier for IT to import the hardware information of new devices into the Autopilot service, we build up the following logic:

  1. Gather hardware information via PowerShell Script Get-WindowsAutoPilotInfo during wipe and reload
  2. Upload .csv file via AzCopy to an Azure Blob Storage
  3. Gather .csv files from Azure Blob Storage and combine them into a single combined.csv file
    This was a manual step in my previous solution
  4. Upload combined .csv file to Autopilot and assign Deployment Profiles
    This was a manual step in my previous solution
  5. Device can be delivered to the end user as if it were shipped by the OEM vendor
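Steps 1 and 2 on the client side can be sketched like this (the storage account, container, and SAS token are placeholders for your environment; the AzCopy call uses the classic v7.x syntax):

```powershell
# 1. Gather the Autopilot hardware hash into <hostname>.csv
$csvFolder = "$env:TEMP"
$csvName   = "$env:COMPUTERNAME.csv"
.\Get-WindowsAutoPilotInfo.ps1 -OutputFile (Join-Path $csvFolder $csvName)

# 2. Upload the .csv via AzCopy (v7.x syntax) to the Blob Storage container,
#    authenticated with a write-only SAS token
.\AzCopy.exe /Source:$csvFolder /Pattern:$csvName `
    /Dest:"https://mystorage.blob.core.windows.net/autopilot" `
    /DestSAS:"<your-sas-token>" /Y
```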

You can read more about the initial solution here: Gather Windows 10 Autopilot info in azure blob storage during wipe and reload

This blog post is all about automating these two steps – gathering and upload of Autopilot information to Intune.

Architecture

First, I will explain the architecture and how it works and then I’m going to describe the way to implement it. The overall architecture is based on an Azure Automation Runbook and looks like this:

AutoPilotImportArchitectureOverview

The new procedure including the enhanced logic for a complete automation of the import is now as follows (modified steps for complete automation):

  1. Gather hardware information via PowerShell Script Get-WindowsAutoPilotInfo during wipe and reload scenario
  2. Upload .csv file via AzCopy to an Azure Blob Storage
  3. Gather .csv files from Azure Blob Storage and combine them into a single .csv file with the help of a scheduled Azure Runbook
  4. Upload combined .csv file information to Windows Autopilot Service via PowerShell Script WindowsAutoPilotIntune running in an Azure Automation Runbook 
  5. Cleanup Azure Blob Storage (delete all .csv files of successfully imported devices and delete all .csv files of already imported devices)
  6. Generate import notification and summary and post it to a Microsoft Teams channel
  7. Autopilot information is available for the OOBE user enrollment scenario with Autopilot. The Autopilot profile gets automatically assigned by a dynamic AzureAD device group membership.
  8. Device can be delivered to the end user as if it were shipped by the OEM vendor

I’ve still chosen the approach of copying individual .csv files to the Azure Blob Storage via AzCopy, as we can then limit the access quite well via a shared access signature and easily restrict permissions to writing blob objects only. There is no need to provide credentials or Blob Storage keys on the client side. Sure, we could build a more advanced HTTP endpoint to gather device information, but this approach is quick and simple. I’m pretty sure the complete solution of this automation task is something we will not need in the future, once devices are migrated to Windows 10 and we buy Autopilot-ready hardware only.
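Such a write-only shared access signature can be created, for example, with the Azure.Storage/AzureRM PowerShell module; the account name, key, container name, and expiry below are placeholders:

```powershell
# Create a write-only SAS token for the container: clients may add blobs,
# but cannot read, list, or delete anything.
$ctx = New-AzureStorageContext -StorageAccountName "mystorage" `
                               -StorageAccountKey "<your-storage-key>"
New-AzureStorageContainerSASToken -Name "autopilot" `
    -Permission w `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Context $ctx
```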

 

Guide to build the new solution

The Autopilot Graph API is an API with a focus on batch processing. This means we import new device information into a kind of staging area, and the Windows Autopilot service picks up the new device information and starts importing it. The amount of time this process takes varies, and we have to check the status of all devices to get the import result. As soon as the devices are processed, we can clean up the staging area and the import is done. Normally we would do this by wrapping the Graph API calls (REST) into some PowerShell functions and building the logic for the described process. Luckily, Microsoft released a new PowerShell module WindowsAutoPilotIntune (thx to @mniehaus) based on the Graph API to import new Autopilot information into Intune.
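A minimal sketch of this staged import flow with the WindowsAutoPilotIntune module (cmdlet names as provided by the module version available at the time of writing; the service account and file path are placeholders):

```powershell
Import-Module WindowsAutoPilotIntune

# Authenticate with the automation service account
Connect-AutoPilotIntune -user "svc-autopilot@contoso.com"

# Stage the combined .csv; the cmdlet uploads the device identities to the
# staging area and polls the import status until the batch is processed
Import-AutoPilotCSV -csvFile "C:\Temp\combined.csv"
```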

In my previous blog post about Process automation for Intune and Azure AD with Azure Automation, I created a Runbook to delete old devices from Intune via Graph API and demonstrated how to do an unattended authentication within the Runbook. All the details of how this can be achieved are explained there. Please follow that guide to set up the Azure Automation account. I use the same unattended authentication technique to utilize the PowerShell module WindowsAutoPilotIntune to import the device information into the Autopilot service in the following Runbook. Additionally, the Runbook is built to protect against concurrent execution (thx to Tao Yang, I used his implementation for it) to ensure sequential processing and to keep track of currently running imports. If we designed this as a concurrent solution, monitoring and reporting would get much harder in the end. In addition, the staging area of the API accepts a maximum import of 175 devices, which we take care of by limiting the Runbook import to 175 devices during one run.
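The concurrent execution protection boils down to checking at the start of the Runbook whether another job of the same Runbook is already active. A sketch (resource group, automation account, and Runbook names are placeholders):

```powershell
# Exit if another job of this Runbook is already running, starting, or queued.
# The current job itself always appears in the list, hence the count above 1.
$jobs = Get-AzureRmAutomationJob -ResourceGroupName "rg-automation" `
            -AutomationAccountName "aa-intune" `
            -RunbookName "Import-AutoPilotInfo"
$active = @($jobs | Where-Object { $_.Status -in @('Running','Starting','Queued') })
if ($active.Count -gt 1) {
    Write-Output "Another import job is already active - exiting to keep processing sequential."
    exit
}
```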

If the requirements are implemented based on the previous blog post (especially the Intune native app and the automation service account), we simply need to extend the permissions of the automation account created during the setup of Process automation for Intune and Azure AD with Azure Automation (section Building the solution) with “Read and write Microsoft Intune configuration”.

IntuneNativeAppPermission

For the concurrent execution protection, our automation credential needs Reader permission, and for Blob Storage access we need Contributor permissions on the subscription. As a result, we grant Contributor permission to the automation account:

SubscriptionPermissions

Finally, we can implement the complete Runbook which can be found on my GitHub account here:

https://github.com/okieselbach/Azure-Automation/blob/master/Import-AutoPilotInfo.ps1

The Runbook is written in PowerShell and follows the logic described in the beginning of this post – section architecture.

Create a PowerShell Runbook and paste-in the code.

CreateNewRunbook

PowerShellRunbook

To make sure the Runbook runs successfully we need to define some additional variables. I assume that the IntuneClientId and Tenant variables are defined as described in the previous blog post.

AzureAutomationVariables

Additional variables needed for Azure Blob Storage access:

ContainerName: <your-blob-storage-containername>
StorageAccountName: <your-blob-storage-account>
StorageKey: <your-blob-storage-secret-key> (as encrypted variable)
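Inside the Runbook these variables are read with Get-AutomationVariable; encrypted variables like StorageKey are decrypted transparently by the cmdlet:

```powershell
# Read the Azure Automation variables defined above
$containerName      = Get-AutomationVariable -Name 'ContainerName'
$storageAccountName = Get-AutomationVariable -Name 'StorageAccountName'
$storageKey         = Get-AutomationVariable -Name 'StorageKey'   # encrypted variable
```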

Additional variables needed for Microsoft Teams Channel notification:

SubscriptionUrl: <your-subscription-url>

The subscription URL can be found as shown below. Please do not copy the trailing /overview part of the URL; it should end with the subscription GUID only (as highlighted):

AzureSubscriptionsUrl

TeamsWebHookUrl: <your-ms-teams-webhook-url>

Open MS Teams and choose a channel where the Autopilot notifications from the Azure Runbook should be displayed. Click on the three dots and choose Connectors:

TeamsConnectors

Look for Incoming Webhook and click Configure

incomingwebhook.png

Type in the details, upload an icon, and click Create

ConnectorDetails

Finally copy the Webhook URL:

ConnectorWebHookUrl

Paste it into the Azure Automation variable TeamsWebHookUrl and set encrypted value to Yes

TeamsWebHookVarEncrypted

This is necessary to get Microsoft Teams notifications with some statistics and information to troubleshoot errors. Below is an example of an import notification in Microsoft Teams:

AutoPiloImportTeamsNotification

We get some statistics, a detailed error list with device information, and a link to the Runbook itself in Azure, all based on Adaptive Cards JSON code. This can easily be modified to fulfill personal needs. Have a look at the Adaptive Cards Designer (https://acdesignerbeta.azurewebsites.net) to experiment with layouts and adjust the Runbook code.
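Posting to the channel is a plain HTTP POST against the webhook URL. A minimal sketch (the real Runbook sends a full Adaptive Card payload; the title and text here are illustrative only):

```powershell
# Post a simple notification card to the Teams channel via the incoming webhook
$teamsWebHookUrl = Get-AutomationVariable -Name 'TeamsWebHookUrl'
$body = @{
    title = 'AutoPilot Import Finished'
    text  = 'See the Runbook output in Azure for details.'
} | ConvertTo-Json
Invoke-RestMethod -Uri $teamsWebHookUrl -Method Post `
    -ContentType 'application/json' -Body $body
```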

 

Enhanced client-side script part

In my previous blog post about Gather Windows 10 Autopilot info in azure blob storage during wipe and reload I described how to set up the Azure Blob Storage to gather the “<hostname>.csv” files during MDT operating system installation. Please follow that guide to set up the Azure Blob Storage and the integration in MDT.

I now have an enhanced version of the gather script, which can be found on my GitHub account and is also shown below. The enhanced version does not depend on AzCopy.exe (incl. dependency files) and Get-WindowsAutoPilotInfo.ps1 being present in the script directory. If they are not available, they are downloaded from an additional Blob Storage container named resources. This additional container must be created, and AzCopy.zip and Get-WindowsAutoPilotInfo.ps1 must be uploaded there, for the script to run successfully:

BlobStorageResources

BlobStorageResourcesContent

The enhanced Get-WindowsAutoPilotInfoAndUpload.ps1 version:

Replace the ZZZZ placeholders in the script above with your Blob Storage account name and your SAS signature. See Delegating Access with a Shared Access Signature for more SAS signature details.

This version can also be executed via the Microsoft Intune Management Extension to run it on existing Windows 10 devices. It is possible to collect the information of all targeted devices, and the Runbook will import it. Already imported devices will be skipped. This way we can make sure every device is imported into the Autopilot service.

SideCarAutoPilotScript

 

Sample output of the Runbook

AutoPiloImportRunbookOutput

If the device information <hostname>.csv is successfully imported, the .csv file is deleted from the Azure Blob Storage. In case of an error it is left there untouched, but reported via Runbook output and Teams notification. There is one case where the Runbook deletes the .csv file from the Azure Blob Storage anyway: when it detects error 806 – ZtdDeviceAlreadyAssigned. In that case we can delete the .csv, as it has no consequences. In every other error situation someone needs to resolve the error manually. The Teams notification is only generated if there is some device information in the Azure Blob Storage. The normal procedure would be: when operations gets an import notification, it should be checked, and in case of errors they should be resolved manually.
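The per-device cleanup decision can be sketched like this (the state property names follow the importedWindowsAutopilotDeviceIdentity Graph resource; the mapping from device to blob name is simplified for illustration):

```powershell
# Delete the corresponding blob for successful imports and for error 806
# (ZtdDeviceAlreadyAssigned); leave everything else for manual investigation.
foreach ($device in $importResult) {        # $importResult: processed staging entries
    $status    = $device.state.deviceImportStatus
    $errorCode = $device.state.deviceErrorCode
    if ($status -eq 'complete' -or $errorCode -eq 806) {
        Remove-AzureStorageBlob -Container $containerName `
            -Blob $device.csvBlobName -Context $ctx   # csvBlobName: illustrative mapping
    }
}
```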

AutoPiloImportTeamsNotification

 

Important observations during testing

The Autopilot import via Graph API takes some time, and it may also time out; it is not a typical request/response REST API in this case. Remember, all device information is staged, and the devices are then monitored for their success or error state. This led to situations where I had to wait up to an hour for a successful import, even though the UI in Intune tells us it takes up to 15 minutes. So be prepared that it might take longer or fail during the run. I chose this approach as it is the same procedure the Azure Portal uses for the import. In fact, we really automated the import process in Intune and did not invent a completely different process, which might cause different problems with device tracking or concurrency and so on. Depending on the use case, you can run the Runbook on a recurring schedule. My normal use case for this scenario is to support people who are preparing older devices being reinstalled from Windows 7 to Windows 10. If someone expects immediate availability of the Autopilot information after such a reinstall, this might be problematic, as the API is not designed for this. Another fact is that the Autopilot deployment profile assignment via Intune also takes some time at the moment; I observed that it sometimes took several hours. I suggest re-arranging the operational processes to hand out reinstalled Windows 10 devices after some additional time and not directly after reinstalling, as handing them out immediately increases the possibility that the Autopilot information has not been imported and no profile has been assigned yet.

To run the Runbook on a recurring schedule, just go to the Runbook and add a schedule. The maximum recurrence is limited to once every hour.

AddRunbookSchedule

Just define a schedule (max. once per hour) and monitor the recent jobs overview to verify it works:

AzureRunbookSchedule

The scheduling can even be enhanced by using a simple Logic App instead of the Azure Automation Runbook schedule:

LogicAppTriggerRunbook

Please see Stefan Stranger’s post for detailed instructions on how to implement a simple Logic App to trigger an Azure Automation Runbook:

https://blogs.technet.microsoft.com/stefan_stranger/2017/06/23/azur-logic-apps-schedule-your-runbooks-more-often-than-every-hour/

In case something goes wrong, I have created a second Runbook to clean up the staging area of the Autopilot Graph API. Get it from my GitHub account and run it in case of fatal errors where you want to clean up the staging area:

https://github.com/okieselbach/Azure-Automation/blob/master/Cleanup-AutoPilotImportedDevices.ps1

Here is a sample output of the Cleanup Runbook:

CleanupRunbookOutput

Further information

Azure Logic Apps – Schedule your Runbooks more often than every hour
https://blogs.technet.microsoft.com/stefan_stranger/2017/06/23/azur-logic-apps-schedule-your-runbooks-more-often-than-every-hour

Preventing Azure Automation Concurrent Jobs In the Runbook
https://blog.tyang.org/2017/07/03/preventing-azure-automation-concurrent-jobs-in-the-runbook

Post notifications to Microsoft Teams using PowerShell
https://blogs.technet.microsoft.com/privatecloud/2016/11/02/post-notifications-to-microsoft-teams-using-powershell

importedWindowsAutopilotDeviceIdentity resource type
https://developer.microsoft.com/en-us/graph/docs/api-reference/beta/resources/intune_enrollment_importedwindowsautopilotdeviceidentity

Autopilot profile assignment using Intune
https://blogs.technet.microsoft.com/mniehaus/2018/06/13/autopilot-profile-assignment-using-intune

Adaptive Cards Designer
https://acdesignerbeta.azurewebsites.net

I published the same article on SCConfigMgr in a more step-by-step guide version, meaning without so many cross-references to my other articles:

Automation of gathering and importing Windows Autopilot information
http://www.scconfigmgr.com/2018/07/23/automation-of-gathering-and-importing-windows-autopilot-information

 

I hope this can increase your throughput on the way to an Autopilot-based Windows 10 modern management environment.

If you find bugs or problems with the solution, let me know and leave a comment. I will do my best to fix them, as this should be a reliable part of preparing old devices with Windows 10.

Intune Managed Browser (MAM) with Azure AD Application Proxy and Conditional Access

Recently Microsoft enhanced the Intune Managed Browser experience with Mobile Application Management (MAM) and app-based Conditional Access (CA) a lot. It is integrated into the Conditional Access story as an approved app and now supports the Azure AD Application Proxy very well.

 

What does this allow us to do now?

We are now able to design a solution that publishes our internal websites externally with minimal effort and then allows access to them from our mobile devices only via the Intune Managed Browser, protected by an Intune app protection policy. This ensures the information is safeguarded in our containerized Intune MAM solution, which gives most companies enough trust to actually publish internal resources for usage on mobile devices and support a bring-your-own-device (BYOD) solution.

Please read the How does Application Proxy work? documentation from Microsoft to get a better understanding what we are going to do in the next section with the Azure AD Application Proxy. The Azure AD Application Proxy architecture is shown in the figure below:

ArchitectureAADAP

One of the nice things is that it does not require us to open any inbound firewall ports. As long as we are allowed to make outbound connections, we can easily publish internal websites externally. The solution even supports various authentication scenarios, including Single Sign-On (SSO).

 

Here is a walkthrough of a demo setup to show it in action

The walkthrough of the demo scenario should give you a deeper understanding of the new possibilities. Assume we have some internal websites, e.g. intranet and expenses, which are available in the internal network only. To simulate that, I have set up an IIS server hosting the two simple websites intranet and expenses within a private network. They are reachable on the IIS server via http://localhost for intranet and http://localhost:81 for expenses. In addition, I have a link from the intranet site pointing to the expenses website (link target: http://localhost:81, compare the screenshot with the HTML source code). I built the two demo sites to also demonstrate link translation with Azure AD Application Proxy later on.

AADAPIntranetSites

AADAPIntranetSitesHtml

 

How do we get the internal websites published now?

First of all, we need to switch off the IE Enhanced Security Configuration on the Windows Server; otherwise we are not able to complete the login prompt of the Azure AD Application Proxy during the setup procedure. Then we download the Azure AD Application Proxy connector on our demo IIS server and run the MSI installer. It’s a very lightweight installer, and the only thing we need to provide to finish the process is a Global Administrator credential.

AzureADAppProxyDownload

The next step after installing the connector is to enable it by clicking Enable application proxy. After it is enabled, the UI switches to “Disable application proxy” (shown in the screenshot as step 3). Once enabled, we have the connector group Default with our server listed there. It is possible to install more than one connector and build connector groups to improve the reliability of the publishing (in fact, this is recommended). The connector does not need to be installed on the IIS server as I have done in my demo setup; it should run on a dedicated Windows Server 2016, for example. I ran it on the IIS server for the simplicity of my setup and to use the internal address http://localhost during publishing later on.

The official documentation for the Azure AD Application Proxy from Microsoft can be found at https://docs.microsoft.com/en-us/azure/active-directory/active-directory-application-proxy-enable, or you can follow the link “Learn more about Application Proxy” on the application proxy blade.

With an up-and-running connector we can now publish the websites. It is best to follow the detailed step-by-step guide from Microsoft https://docs.microsoft.com/en-us/azure/active-directory/application-proxy-publish-azure-portal and make both available. I published both sites as Enterprise Applications as described, used no custom domain, but enabled link translation in the application body.

Published internal websites:

AzureADAppProxyPublishedWebsites

Details of the website intranet with internal URL http://localhost

AzureADAppProxyIntranet

Details of the website expenses with internal URL http://localhost:81

AzureADAppProxyExpenses

Now I can open my published intranet site externally, and the intranet link originally pointing to http://localhost:81 is replaced by the application proxy, because we enabled link translation on the application body (compare the screenshot below). This only works if we publish both websites, as the application proxy must find a published website for http://localhost:81 to do the translation.

AADAPIntranetSitesExternal

In a real-world implementation I would recommend using a custom domain for publishing to maintain your links. For example, if we have mydomain.com as the Active Directory (AD) domain and publish via Azure AD Application Proxy with the custom domain mydomain.com, the website can be reached internally and externally via the same URL. To set this up, follow the instructions here:

Working with custom domains in Azure AD Application Proxy
https://docs.microsoft.com/en-us/azure/active-directory/active-directory-application-proxy-custom-domains

 

Securing our Intune mobile apps with Intune application protection policies

Now we need to add a MAM policy (app protection policy) to secure the Intune Managed Browser and mobile Outlook. To do that, we open Intune > Mobile apps > App protection policies > Add a policy

MAMPolicyAdd

After adding the policy, we make sure Outlook and the Managed Browser are among the targeted apps, and of course we adjust the individual policy settings to meet our corporate standard and to realize the containerization (e.g. let apps only transfer data to other managed apps, encrypt data, and so on).

MAMPolicyTarget

For the policy settings we need to make sure the setting Restrict web content to display in the Managed Browser is set to Yes. This makes sure internal links in emails are opened in the Intune Managed Browser. Even better, because of the Azure AD Application Proxy publishing, we can make sure internal links get translated and opened successfully in the Intune Managed Browser. We will do that by assigning an additional app configuration policy in the next step.

MAMPolicySettings

As the last configuration step, we assign the app protection policy to the AAD user group we want to target.

To configure the Intune Managed Browser to work hand in hand with the Azure AD Application Proxy and translate internal URLs to the published URLs we need to configure an app configuration policy for the managed browser.

AppConfigurationAppProxyRedirection

AppConfigurationAppProxyRedirectionTarget

Now the important piece of configuration is:

Key: com.microsoft.intune.mam.managedbrowser.AppProxyRedirection
Value: true

The screenshot below does not display the complete string!

AppConfigurationAppProxyRedirectionSetting

Again, as the last configuration step, we assign the app configuration policy to the AAD user group we want to target.

 

Controlling access to the internal websites with app-based Conditional Access

Now we need to make sure our internal published website can only be accessed by Intune approved apps which are protected by app protection policy.

To do that, we create the following Conditional Access policy in Intune or in the Azure AD portal. We assign our AAD user group, target All cloud apps, include iOS and Android devices, and select Browser and Mobile apps and desktop clients

CAMAMBrowserAndDesktopApps

As access control we grant access for approved client apps by choosing the option Require approved client app

CAMAMApprovedApps

 

How about the user experience?

Everything is in place, and we assume someone in the company sent us an internal link to the new intranet site http://localhost. We open mobile Outlook on iOS in this example:

OutlookIntranetMail

If we now click on the internal link, Outlook, configured to Restrict web content to display in the Managed Browser, opens the link in the Intune Managed Browser for us. The Intune Managed Browser, instructed via AppProxyRedirection = true, then redirects us to the externally published URL instead of the internal URL, as shown below, and displays the demo intranet site:

ManagedBrowserIntranet

Even the link within the demo intranet site is translated and will open the published demo expenses website:

ManagedBrowserExpenses

To make sure that the published intranet site is only accessible via the Intune Managed Browser, we open Safari, type in the external URL of the published intranet site, and check whether access is blocked:

SafariBlockInternalWebsite

As we can see, access is blocked and we get nice feedback telling us to use the Intune Managed Browser instead; we can directly use the blue link button to open it.

 

Summary

We have seen how to easily publish internal websites via Azure AD Application Proxy. Then we configured our mobile apps to use an Intune app protection policy and instructed the Intune Managed Browser to use Azure AD Application Proxy redirection to translate internal links and open them successfully. This protects the published internal websites and prevents data leakage.

 

Further information

The Intune Managed Browser now supports Azure AD SSO and Conditional Access!
https://cloudblogs.microsoft.com/enterprisemobility/2018/03/15/the-intune-managed-browser-now-supports-azure-ad-sso-and-conditional-access/

Better together: Intune and Azure Active Directory team up to improve user access
https://cloudblogs.microsoft.com/enterprisemobility/2017/07/06/better-together-intune-and-azure-active-directory-team-up-to-improve-user-access/

Manage Internet access using Managed Browser policies with Microsoft Intune
https://docs.microsoft.com/en-us/intune/app-configuration-managed-browser

How to create and assign app protection policies
https://docs.microsoft.com/en-us/intune/app-protection-policies

 

My advice to all: give it a try and start to play with MAM and app-based Conditional Access, as it might be a quick win for your company and finally allow the usage of BYOD, since company data can be protected very well in this scenario.

Happy publishing and protecting 🙂

Process automation for Intune and Azure AD with Azure Automation

IntuneAndAzureAutomation

Cloud-managed environments benefit from the idea of software as a service: you don’t have to think about upgrading or maintaining the infrastructure itself. But often we need to automate the tools themselves. A very good example is when an employee quits their job: then we need to trigger a lot of processes, like disabling the account, retiring and wiping the device(s), sending some notes to various people, and so on. Another example might be the cleanup of devices within Intune and Azure AD, as they get stale over time and are not used by their users anymore.

 

Introduction

In the following blog post I’d like to show how to automate the process of deleting old devices from Intune and Azure AD, without the help of on-premises services like servers running scheduled scripts. The established cloud workflow can be used by the service desk to quickly delete a device in both involved services, Intune and AAD. After seeing a lot of environments where devices are cleaned up in Intune but left in AAD, I thought it’s beneficial to show how easily this can be automated with Azure Automation. Once the basics are built, it’s just a matter of combining new tasks within a Runbook to build other workflows which are worthwhile in your environment.

I will show how to set up the Azure environment and create the first Runbook. A Runbook is the actual workflow which runs the PowerShell script. The Runbook will do unattended authentication against the Intune API via Microsoft Graph to manage Intune. At the time of writing there is no PowerShell module for Intune, therefore we use the Intune API in Microsoft Graph. For the AAD operations we use the AzureAD module to perform the management tasks.

 

How to do unattended authentication with the Intune API?

The problem with the Intune API and Microsoft Graph is that we can’t authenticate as an application, as this is not supported at the time of writing. See the section Intune Device Management permissions &gt; Application permissions: None.
https://developer.microsoft.com/en-us/graph/docs/concepts/permissions_reference#intune-device-management-permissions.

We need to authenticate as a user (service account). This requires additional credentials and secure storage of them for automation. Microsoft has a good guide on how to set up an Azure application to support this scenario: How to use Azure AD to access the Intune APIs in Microsoft Graph. One caveat: the Microsoft how-to guide ends up in a scenario which still prompts for credentials with an input form. This is because of the usage of:

AuthenticationContext.AcquireTokenAsync

For Azure Automation we need to change this behavior a bit to support credentials within our code:

AuthenticationContextIntegratedAuthExtensions.AcquireTokenAsync

We can use the How-To guide or the official GitHub Intune sample scripts which have the following lines of code:

$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority
$platformParameters = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters" -ArgumentList "Auto"
$userId = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserIdentifier" -ArgumentList ($User, "OptionalDisplayableId")
$authResult = $authContext.AcquireTokenAsync($resourceAppIdURI, $clientId, $redirectUri, $platformParameters, $userId).Result

They need to be changed to the AcquireTokenAsync overload which supports UserPasswordCredential as an additional parameter:

$intuneAutomationCredential = Get-AutomationPSCredential -Name automation

$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority
$platformParameters = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters" -ArgumentList "Auto"
$userId = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserIdentifier" -ArgumentList ($intuneAutomationCredential.Username, "OptionalDisplayableId")
$userCredentials = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.UserPasswordCredential -ArgumentList $intuneAutomationCredential.Username, $intuneAutomationCredential.Password
$authResult = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContextIntegratedAuthExtensions]::AcquireTokenAsync($authContext, $resourceAppIdURI, $intuneAutomationAppId, $userCredentials).Result

The credentials are retrieved from the Azure Automation account in PowerShell via Get-AutomationPSCredential. We provision the service account credentials securely to the Azure Automation account via Credential assets.
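To illustrate how the acquired token is actually used, here is a hedged sketch of a call against the Intune API in Microsoft Graph. It assumes $authResult was obtained as shown above; the device name 'PC01' is a placeholder, and the endpoint version may differ depending on when you read this.

```powershell
# Sketch only: $authResult comes from the authentication snippet above.
# Build the Authorization header from the acquired token.
$headers = @{
    'Authorization' = "Bearer $($authResult.AccessToken)"
    'Content-Type'  = 'application/json'
}

# Example query against the Intune API in Microsoft Graph:
# list managed devices with a given name ('PC01' is a placeholder).
$uri = "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices?`$filter=deviceName eq 'PC01'"
$response = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get
$response.value | Select-Object id, deviceName, userPrincipalName
```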

 

Building the solution

The following steps must be followed to build the solution:

  1. Creation of a native application in Azure AD
  2. Assigning permissions to the registered application
  3. Grant permissions (administrator consent)
  4. Create Azure Automation Account
  5. Add Azure AD module to the Azure Automation Account
  6. Add credentials to the Azure Automation account
  7. Add variables to the Azure Automation account
  8. Add Runbook to the Azure Automation account
  9. Edit Runbook
  10. Start and test Runbook
  11. Add Automation Operator

 

1. Creation of a native application in Azure AD

The best description of a native application is found in the Intune documentation for the Intune API here: How to use Azure AD to access the Intune APIs in Microsoft Graph. I will outline the necessary steps to set up the environment.

New application registration

AddAppRegistration

Fill out the details, give it a name, and create a native application with the redirect URI: urn:ietf:wg:oauth:2.0:oob

AppRegistration

In the end a newly registered application is available. It is important to copy the application ID, as we need it in our PowerShell Runbook later.

RegisteredApp

 

2. Assigning permissions to the registered application

The registered application must have AAD Read and Write permissions, and Intune Read and Write permissions.

RequiredPermissions

AddAPIAccess

AddApplicationPermissionAAD

ApplicationPermissionAAD

 

3. Grant permissions (administrator consent)

Finally we grant the selected permissions to the newly registered application.

GrantPermissions

ConfirmGrantPermissions

 

4. Create Azure Automation Account

Create the Azure Automation account in an existing or new resource group.

AddAutomationAccount

AddAutomationAccountDetails

 

5. Add Azure AD module to the Azure Automation Account

To have access to the AzureAD module we add it via the Gallery; choose Browse Gallery.

AzureAutomationAddModule

AzureAutomationBrowseGallery

 

6. Add credentials to the Azure Automation account

Go to Azure AD and create a new user, in my case the user automation with display name Intune Automation, and use a complex password for it.

IntuneAutomationUserRoles

At the moment we need to assign the Global Administrator role as we want to delete devices in Azure AD. This information is based on: https://docs.microsoft.com/en-us/azure/active-directory/device-management-azure-portal#delete-an-azure-ad-device
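The same user creation and role assignment could also be scripted with the AzureAD module instead of the portal. The following is a hypothetical sketch; the UPN and password are placeholders, and note that the Global Administrator role is surfaced as "Company Administrator" in the AzureAD module.

```powershell
# Hypothetical sketch using the AzureAD module instead of the portal.
Connect-AzureAD

# password is a placeholder - use a complex one
$pp = New-Object -TypeName Microsoft.Open.AzureAD.Model.PasswordProfile
$pp.Password = '<complex password>'

$user = New-AzureADUser -DisplayName "Intune Automation" `
    -UserPrincipalName "automation@yourtenant.onmicrosoft.com" `
    -MailNickName "automation" -AccountEnabled $true -PasswordProfile $pp

# Global Administrator appears as "Company Administrator" in the AzureAD module
$role = Get-AzureADDirectoryRole | Where-Object { $_.DisplayName -eq "Company Administrator" }
Add-AzureADDirectoryRoleMember -ObjectId $role.ObjectId -RefObjectId $user.ObjectId
```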

After user creation we add the credential to the Azure Automation account.

AddAzureAutomationCredential

AzureAutomationCredential

 

7. Add variables to the Azure Automation account

The following PowerShell needs the native registered application ID, also called client ID. Therefore we create an Azure Automation variable IntuneClientId. We also need the tenant ID as a variable; we use Tenant as the identifier.

AddAzureAutomationVariable

Below is an example for IntuneClientId; add your application ID from above. Do the same for the Tenant variable and add your tenant ID.

AddAzureAutomationVariableClientId

 

8. Add Runbook to the Azure Automation account

Adding a Runbook with the name Invoke-RetireDevice

AzureAutomationAddRunbook

AzureAutomationAddRunbookDetail

 

9. Edit Runbook

We verify that the Runbook sees all our important pieces like the AzureAD module, variables, and credentials. After adding the PowerShell script we need to publish it.

EditRunbookDetail

The PowerShell script for the Runbook is based on the GitHub samples, with the modification to allow non-interactive usage of credentials via Get-AutomationPSCredential and Get-AutomationVariable:

$intuneAutomationCredential = Get-AutomationPSCredential -Name automation
$intuneAutomationAppId = Get-AutomationVariable -Name IntuneClientId
$tenant = Get-AutomationVariable -Name Tenant

Now follows the actual PowerShell script with the logic to get the device of the user and delete it from Intune, using the automation credentials and the variables for client ID and tenant. In the end it uses the same credentials to delete the device from AAD as well.
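The core of that logic could be sketched as follows. This is a hedged sketch, not the full Runbook: $authResult and $intuneAutomationCredential come from the authentication snippets above, and the Graph endpoint version may differ.

```powershell
Param(
    [Parameter(Mandatory = $true)] [string]$DeviceName,
    [Parameter(Mandatory = $true)] [string]$UserPrincipalName
)

# sketch: $authResult and $intuneAutomationCredential are obtained
# via the unattended authentication shown earlier
$headers = @{ 'Authorization' = "Bearer $($authResult.AccessToken)" }

# find the user's managed device in Intune by name
$uri = "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices" +
       "?`$filter=deviceName eq '$DeviceName'"
$device = (Invoke-RestMethod -Uri $uri -Headers $headers -Method Get).value |
          Where-Object { $_.userPrincipalName -eq $UserPrincipalName }

if ($device)
{
    # delete the device record from Intune
    Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/deviceManagement/managedDevices/$($device.id)" `
        -Headers $headers -Method Delete

    # delete the corresponding device object from Azure AD
    Connect-AzureAD -Credential $intuneAutomationCredential | Out-Null
    Get-AzureADDevice -SearchString $DeviceName |
        ForEach-Object { Remove-AzureADDevice -ObjectId $_.ObjectId }
}
```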

 

10. Start and test Runbook

Everything is set up, now it’s time for the first run. Pick a stale Intune device you would like to retire and start the Runbook.

StartRunbook

The Runbook has two input parameters, DeviceName and UserPrincipalName. Both are needed to avoid matching duplicate entries on DeviceName alone; a user should only have a device with a given name once. If not, we might rethink the PowerShell logic to address this.

StartRunbookDetail

After starting the job we can click on Output

StartRunbook-Output

and get details as defined in our PowerShell script. If everything runs fine you will get the following output:

StartRunbook-OutputDetail

 

11. Add Automation Operator

We add a different user (e.g. a service desk operator) to our Runbook as an Automation Operator. This gives the user the possibility to log on to portal.azure.com and start the Runbook, while the Runbook itself is protected from modifications, as shown below.
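The role assignment can also be scripted. The following is a hypothetical sketch with the AzureRM module; the sign-in name, subscription ID, resource group, and account name are placeholders.

```powershell
# Hypothetical sketch: assign the Automation Operator role scoped to one
# Automation account. All names and IDs below are placeholders.
Connect-AzureRmAccount

New-AzureRmRoleAssignment -SignInName "operator@yourtenant.onmicrosoft.com" `
    -RoleDefinitionName "Automation Operator" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Automation/automationAccounts/<account>"
```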

AzureAutomationAccessControl

AzureAutomationAccessControlDetail

AzureAutomationOperatorPermission

 

Recap what we achieved

We have set up Azure Automation to host our PowerShell script in a managed cloud environment, able to run as a job that deletes a device from Intune and AAD. In addition we learned the basics of Azure Automation: how to add modules and how to work with credentials and variables. Unattended authentication is the basis for any Intune API usage in Azure Automation.

 

Enhancements

  1. Microsoft Flow
  2. Source Control

Microsoft Flow

I thought it would be nice to enhance the Runbook with a Microsoft Flow to trigger it from my mobile phone. I found the following article describing how to do that:

Azure Automation new Microsoft Flow Service
https://blogs.technet.microsoft.com/stefan_stranger/2017/03/30/azure-automation-new-microsoft-flow-service/

Unfortunately, as soon as I tried to use it, I found that Microsoft Flow does not provide a trigger for it at the moment. The manual trigger as shown in the blog post above is not available for me. Maybe we can provide a nice interface for the Runbook in the future via Microsoft Flow.
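As an alternative trigger mechanism, Azure Automation Runbooks can be started via a webhook. The sketch below is hedged: the webhook URL (including its token) is a placeholder generated once when adding a webhook to the Runbook, and the Runbook would need a $WebhookData parameter to parse the posted JSON body.

```powershell
# Sketch: trigger the Runbook via an Azure Automation webhook.
# The webhook URL with its token is generated in the portal; placeholder here.
$webhookUrl = "https://s2events.azure-automation.net/webhooks?token=<token>"

# parameters end up in the Runbook's $WebhookData.RequestBody,
# which the Runbook has to parse itself (e.g. with ConvertFrom-Json)
$body = @{
    DeviceName        = "PC01"
    UserPrincipalName = "jane.doe@yourtenant.onmicrosoft.com"
} | ConvertTo-Json

Invoke-RestMethod -Uri $webhookUrl -Method Post -Body $body -ContentType 'application/json'
```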

Microsoft Flow – Azure Automation
https://flow.microsoft.com/en-us/connectors/shared_azureautomation/azure-automation/

Source Control

When working with code it’s important to have good versioning and a place to store the code. For this, Azure Automation provides integration with GitHub to link your Runbook source code. To set this up, follow the guide below. I can really recommend it.

Source control integration in Azure Automation
https://docs.microsoft.com/en-us/azure/automation/automation-source-control-integration

 

Further information

Azure Automation User Documentation
https://docs.microsoft.com/en-us/azure/automation/

How to use Azure AD to access the Intune APIs in Microsoft Graph
https://docs.microsoft.com/en-us/intune/intune-graph-apis

Credential assets in Azure Automation
https://docs.microsoft.com/en-us/azure/automation/automation-credentials

Intune Device Management permissions
https://developer.microsoft.com/en-us/graph/docs/concepts/permissions_reference#intune-device-management-permissions

Graph Explorer – Microsoft Graph
https://developer.microsoft.com/en-us/graph/graph-explorer

Another very good guide using Azure Automation with Intune and AAD is here:
Unattended authentication against the Microsoft Graph API from PowerShell
http://www.powershell.no/azure,graph,api/2017/10/30/unattended-ms-graph-api-authentication.html

If you want to learn more about Intune housekeeping with scheduled Azure Automation PowerShell scripts, visit Ronny’s blog:
https://ronnydejong.com/2018/04/11/keep-your-microsoft-intune-tenant-clean-and-tidy-w-azure-automation-graph-api

 

Have fun with automation. Feel free to post your process automation ideas in the comment area below! Thanks for reading!

Gather Windows 10 Autopilot info in Azure Blob Storage during wipe and reload

UPDATE 22/07/2018: New blog post Automation of gathering and importing Windows Autopilot information

The Modern Management strategy is based on Enterprise Mobility + Security and additional services like Office 365. Microsoft created a new SKU called Microsoft 365 for this. To complete the big picture we need some additional services:

The idea is clear, manage the Windows 10 devices like mobile phones. No more Operating System Deployment (OSD) just provisioning and management from everywhere. Everything is powered by the cloud.

A new member in this story is a feature called Windows Autopilot. You can compare it with the Device Enrollment Program you might know from Apple. It provides a managed way of provisioning with near zero touch. IT is able to control the experience the end user has during the enrollment process. To make all this work we need to gather some properties of the device to identify it clearly. Autopilot needs the device serial number, the Windows product ID, and the hardware hash. This information is uploaded to the Autopilot service; the device will then be recognized during OOBE as an Autopilot device and show a customized enrollment experience.
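As a hedged illustration of what these properties look like, the hardware hash is exposed by the MDM bridge WMI provider and the serial number by Win32_BIOS; the real Get-WindowsAutoPilotInfo script handles more edge cases than this simplified sketch (run elevated on the device).

```powershell
# Simplified sketch of the data Get-WindowsAutoPilotInfo collects (run elevated).
$serial = (Get-WmiObject -Class Win32_BIOS).SerialNumber

# the hardware hash is exposed by the MDM bridge WMI provider
$devDetail = Get-WmiObject -Namespace "root/cimv2/mdm/dmmap" `
    -Class "MDM_DevDetail_Ext01" `
    -Filter "InstanceID='Ext' AND ParentID='./DevDetail'"
$hash = $devDetail.DeviceHardwareData

# rows in the upload .csv follow this header:
# Device Serial Number,Windows Product ID,Hardware Hash
# (the Windows product ID column is left empty in this sketch)
"$serial,,$hash" | Out-File "$env:computername.csv" -Encoding unicode
```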

The Problem

Many organizations are still using Windows 7 and are on their way to Windows 10. Windows 10 is the new “baseline” in this story. It is aligned with the complete modern management story, provides the capability to join Azure AD, and enables the Windows as a Service model.

How do we get to the new baseline?

If we purchase a new device, the OEM vendor takes care of installing a Windows 10 signature edition and all necessary drivers. In the future the hardware information will be synced into our tenant by the OEM vendor. We then get the device information in Intune, can assign an Autopilot deployment profile, and can start to enroll the device.

DeviceEnrollment

What if we have a bunch of Windows 7 devices in the environment?

A way to handle this is to play the role of the OEM vendor ourselves and install a Windows 10 signature edition on the existing Windows 7 devices. Depending on what is available, we can use ConfigMgr or MDT. In the context of modern management I like to keep the on-premises footprint as low as possible, so I use MDT for this simple task. If ConfigMgr is available, we can build the same thing there.

I use MDT to create a deployment USB medium (removable drive) and build a Standard Task Sequence to deploy Windows 10. We take care of the right drivers and in the end let the device start OOBE again (sysprep.exe /oobe /reboot|shutdown). Now we have the same situation as with a device newly delivered by the OEM vendor. But we can’t deliver the hardware information directly into our tenant like the OEM vendor will do in the future. Good to know that we can get the hardware information with the PowerShell script Get-WindowsAutoPilotInfo and upload the information via a .csv file ourselves.

Now imagine a situation where a rollout team is preparing a lot of machines. We would end up with a lot of .csv files on different USB removable drives. To make it a little easier for IT to import the hardware information of new devices into Autopilot, we build up the following logic:

 

First of all we prepare the Blob Storage for easy csv file storage.

Log in to the Azure portal and click on “Storage accounts”

StorageAccount

Click Add

StorageAccountAdd

fill out the name and choose Account kind: Blob storage

StorageAccountCreation

after creation you should see the storage account

StorageAccountOverview

create a container called hashes

StorageAccountContainer

create a shared access signature with Blob | Write | an extended expiry date/time | HTTPS only, and create a SAS token. A shared access signature is used to limit the permissions and the period of time for accessing the account. See Delegating Access with a Shared Access Signature

StorageAccountSAS

Copy the SAS token as we need it in the following script.
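If you prefer PowerShell over the portal, a write-only SAS token for the container could also be generated like this. A hedged sketch using the AzureRM storage cmdlets; the storage account name and key are placeholders.

```powershell
# Sketch: create a write-only SAS token for the 'hashes' container.
# Storage account name (ZZZZ) and key (XXXX) are placeholders.
$ctx = New-AzureStorageContext -StorageAccountName "ZZZZ" -StorageAccountKey "XXXX"

$sasToken = New-AzureStorageContainerSASToken -Name "hashes" `
    -Permission w `
    -ExpiryTime (Get-Date).AddYears(2) `
    -Protocol HttpsOnly `
    -Context $ctx

$sasToken   # starts with '?' and can be passed to AzCopy via /DestSAS
```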

Download the PowerShell script Get-WindowsAutoPilotInfo and AzCopy. Install AzCopy and get the files from: C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy

Copy the AzCopy files and Get-WindowsAutoPilotInfo.ps1 into the MDT share, e.g. C:\DeploymentShare\Scripts\CUSTOM\HardwareInfo

Create the PowerShell script Get-HardwareInformation.ps1 and copy it to the MDT folder HardwareInfo as well. Replace the SAS token (ending with XXXX) in the script example with the newly created one, and replace ZZZZ with your storage account name.

The script looks for the Get-WindowsAutoPilotInfo.ps1 script, executes it, and creates a computername.csv file in C:\Windows\Temp. From there it is uploaded to the blob storage account and copied to the USB removable drive folder autopilot-script-success or autopilot-script-failed. In case of failure (e.g. missing internet access during deployment) the computername.csv can thus be gathered from the USB drive as well.

hardwareinfocsv


# Author: Oliver Kieselbach
# Date: 11/15/2017
# Description: Generate AutoPilot .csv file and upload to Azure Blob Storage.

# The script is provided "AS IS" with no warranties.

# Download URL for AzCopy:
# http://aka.ms/downloadazcopy

# Download URL for Get-WindowsAutoPilotInfo:
# https://www.powershellgallery.com/packages/Get-WindowsAutoPilotInfo

Function Execute-Command
{
    Param([Parameter(Mandatory = $true)]
          [string]$Command,
          [Parameter(Mandatory = $false)]
          [string]$Arguments)

    $pinfo = New-Object System.Diagnostics.ProcessStartInfo
    $pinfo.FileName = $Command
    $pinfo.RedirectStandardError = $true
    $pinfo.RedirectStandardOutput = $true
    $pinfo.CreateNoWindow = $true
    $pinfo.UseShellExecute = $false
    $pinfo.Arguments = $Arguments
    $p = New-Object System.Diagnostics.Process
    $p.StartInfo = $pinfo
    $p.Start() | Out-Null
    # read the output streams before waiting to avoid a deadlock on full pipes
    $stdout = $p.StandardOutput.ReadToEnd()
    $stderr = $p.StandardError.ReadToEnd()
    $p.WaitForExit()
    [pscustomobject]@{
        stdout   = $stdout
        stderr   = $stderr
        ExitCode = $p.ExitCode
    }
}

$scriptPath = [System.IO.Path]::GetDirectoryName($MyInvocation.MyCommand.Path)
$fileName = "$env:computername.csv"
$outputPath = Join-Path $env:windir "temp"
$outputFile = Join-Path $outputPath $fileName
$autoPilotScript = Join-Path $scriptPath "Get-WindowsAutoPilotInfo.ps1"

Execute-Command -Command "$psHome\powershell.exe" -Arguments "-ex bypass -file `"$autoPilotScript`" -ComputerName $env:computername -OutputFile `"$outputFile`"" | Out-Null

$url = "https://ZZZZ.blob.core.windows.net/hashes"
$sasToken = "?sv=2017-04-17&ss=b&srt=o&sp=w&se=2019-10-16T19:47:51Z&st=2017-10-15T11:47:51Z&spr=https&sig=XXXX"
$result = Execute-Command -Command "`"$scriptPath\azcopy.exe`"" -Arguments "/Source:`"$outputPath`" /Dest:$url /Pattern:$fileName /Y /Z:`"$outputPath`" /DestSAS:`"$sasToken`""

if ($result.stdout.Contains("Transfer successfully:  1"))
{
    $successPath = Join-Path $scriptPath "autopilot-script-success"
    if (-not (Test-Path $successPath))
    {
        New-Item -Path $successPath -ItemType Directory | Out-Null
    }
    Copy-Item -Path $outputFile -Destination $successPath -Force -ErrorAction SilentlyContinue | Out-Null
}
else
{
    $failedPath = Join-Path $scriptPath "autopilot-script-failed"
    if (-not (Test-Path $failedPath))
    {
        New-Item -Path $failedPath -ItemType Directory | Out-Null
    }
    Copy-Item -Path $outputFile -Destination $failedPath -Force -ErrorAction SilentlyContinue | Out-Null
}

UPDATE 22/07/2018: I have an enhanced version of the gather script now, which can be found on my GitHub account. The enhanced version does not depend on AzCopy.exe (incl. dependency files) and Get-WindowsAutoPilotInfo.ps1 being in the script directory. If they are not available, they are downloaded from an additional Blob Storage container named resources. This additional container resources must be created, and AzCopy.zip and Get-WindowsAutoPilotInfo.ps1 must be uploaded there, to successfully run the script. The script is part of a complete automation solution – Automation of gathering and importing Windows Autopilot information

Create another PowerShell script: Download-HardwareInformation.ps1
This can be used later to download all the .csv files from Azure Blob Storage and create the combined .csv for easy upload to Autopilot. Leave this script on your admin workstation. Replace the StorageAccountKey XXXX with one of your storage account access keys, and replace ZZZZ with your storage account name.

StorageAccountAccessKey


# Author: Oliver Kieselbach
# Date: 11/15/2017
# Description: Gather AutoPilot .csv file from Azure Blob Storage, delete them and combine into single .csv file.

# The script is provided "AS IS" with no warranties.

#Install-Module AzureRM

$ctx = New-AzureStorageContext -StorageAccountName ZZZZ -StorageAccountKey XXXX
$path = "C:\temp"
$combinedOutput = "C:\temp\combined.csv"

$count = $(Get-AzureStorageContainer -Container hashes -Context $ctx | Get-AzureStorageBlob |measure).Count
if ($count -gt 0)
{
 Get-AzureStorageContainer -Container hashes -Context $ctx | Get-AzureStorageBlob | Get-AzureStorageBlobContent -Force -Destination $path
 $downloadCount = $(Get-ChildItem -Path $path -Filter *.csv | measure).Count
 if ($downloadCount -eq $count)
 {
 Get-AzureStorageContainer -Container hashes -Context $ctx | Get-AzureStorageBlob | Remove-AzureStorageBlob
 }
 # parse all .csv files and combine to single one for easy upload!
 Set-Content -Path $combinedOutput -Value "Device Serial Number,Windows Product ID,Hardware Hash" -Encoding Unicode
 Get-ChildItem -Path $path -Filter "*.csv" | % { Get-Content $_.FullName | Select -Index 1 } | Add-Content -Path $combinedOutput -Encoding Unicode
}

I assume the MDT share is built and a Standard Task Sequence for a vanilla Windows 10 installation is available. Then we add a task sequence step “Run PowerShell Script” to the folder “Custom Tasks“:

RunPowerShellScriptStep

and configure the Get-HardwareInformation.ps1 script:

GetHardwareInformation

Now you are ready to run an MDT deployment of Windows 10 with an automatic upload of the hardware information to Azure Blob Storage.

After deployment of the devices you can use the Download-HardwareInformation.ps1 script to get the combined.csv file and upload it to the Microsoft Store for Business (MSfB), as the upload is currently available in this portal only.

MSfBAutoPilotUpload

I recommend using MSfB to upload the combined.csv only! Management of the devices and profiles should be done in Intune. Currently the portals do not completely share their information; for example, a profile created in MSfB will not be shown in Intune and vice versa. With modern management where Intune is used, I suggest using MSfB to upload devices and Intune for the management of profiles (creation and assignment).

In the meantime the full functionality is available in Intune, see more here: Autopilot profile assignment using Intune

 

Happy Autopiloting!

There is another great article from Per Larsen (MVP):
How to collect hardware hash to use in AutoPilot as part of MDT OSD