
#Windows #Powershell #Networking

A company I'm working with is looking to move from an OpenVPN connection to a Meraki VPN on newly installed MX hardware.

To accomplish this, I wrote a short script that can be deployed via GPO to add the new VPN connection and uninstall the existing OpenVPN application.

Here's the script:

# Migrate-VPN.ps1
# Adds a new Meraki VPN config and removes the existing OpenVPN GUI application.
# Tim D'Annecy 2022-08-04

Start-Transcript -Path 'C:\temp\Migrate-VPN.log'
function Add-VPN {
  $ConnectionName = 'New VPN'
  $ServerAddress = 'XXXyourhostnameXXX'
  $PresharedKey = 'XXXyourpskXXX'

  $check = Get-VpnConnection -Name $ConnectionName -AllUserConnection -ErrorAction SilentlyContinue

  if ($check) {
    Write-Host 'VPN connection named' $ConnectionName 'already exists. Exiting.'
  }
  else {
    Write-Host 'Adding VPN connection' $ConnectionName
    Add-VpnConnection `
      -Name $ConnectionName `
      -ServerAddress $ServerAddress `
      -TunnelType L2tp `
      -EncryptionLevel Optional `
      -L2tpPsk $PresharedKey `
      -AuthenticationMethod Pap `
      -RememberCredential `
      -AllUserConnection `
      -Force `
      -WarningAction SilentlyContinue
  }
}
Add-VPN

function Remove-OpenVPN {
  if (Test-Path -Path 'C:\Program Files\OpenVPN') {
    Write-Host 'OpenVPN installed. Removing...'
    (Get-WmiObject -Class Win32_Product -filter "Name LIKE 'OpenVPN%'").Uninstall() | Out-Null
  }
  else {
    Write-Host 'OpenVPN not installed. Exiting.'
  }
}
Remove-OpenVPN

Stop-Transcript

Copy and paste this script into your \\domain.com\SYSVOL\scripts folder and save it as Migrate-VPN.ps1.

Once you've done this, go into Group Policy Management and create a new GPO that does three things:

  • Create a folder at C:\temp

  • Copy the file from \\domain.com\SYSVOL\scripts\Migrate-VPN.ps1 to C:\temp\Migrate-VPN.ps1

  • Run a Scheduled Task that calls Powershell to run the script every hour on the hour (example action below)
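
For the scheduled task action, the arguments look something like this (a minimal example; adjust the path if you copy the script somewhere other than C:\temp):

powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\temp\Migrate-VPN.ps1"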

With these things in place, you should see the changes trickle out to your environment as the machines check in.


#Windows #Powershell #Azure

A few of the companies I work with are using a traditional Active Directory domain environment (GPO, WSUS, etc.) but aren't using an inventory tool like Intune or PDQ.

One of the biggest issues that they report is that they aren't able to get any information about live machines in their environment.

Gathering this information is a critical step in moving to cloud-based endpoint management. You won't be able to decommission a domain if there are objects that still check back in to on-prem infrastructure for management.

To work around this, I wrote a Powershell script that runs on a local computer, gathers some information about its config, then pushes it to an Azure Table. This collected data can then be exported to a .csv file and can be ingested into other tools for analytics.

Azure Storage Account and Table

Open the Azure portal and create a new Storage Account. Keep all of the defaults and step through the wizard.

Once the deployment is complete, navigate to the Storage Account and select Tables. In this view, create a table named “domaineddevices”:

Screenshot of Azure portal, viewing a Table inside a Storage Account

After creating the Table, navigate to the Access keys blade. Copy one of the access keys and paste it into the $accesskey line in the script below:

Screenshot of Azure portal, viewing the Access Keys inside a Storage Account

For better compatibility in your environment, change the Minimum TLS version to 1.0 under the Configuration blade. This will allow older versions of Windows to check in with the Table:

Screenshot of Azure portal, viewing the Configuration blade inside a Storage Account

Once this Storage Account is set up, move on to the Powershell section and paste in the key you copied earlier.

Powershell script

I was struggling with writing to an Azure Table, specifically generating the SharedKeyLite authentication signature. I found a few posts [A] [A] that had the signing code I needed. I wrote the rest of the information-gathering lines and tweaked the script so it successfully uploads the collected data to the Azure Table.

Here's my modified Powershell script:

# Check-DomainStatus.ps1

$ScriptVersion = 20220802

Start-Transcript -Path 'C:\temp\Check-DomainStatus.log' -Append -NoClobber
$storageAccount = 'STORAGEACCOUNT' # Update these values for your environment
$accesskey = 'XXX' # Update these values for your environment
$TableName = 'domaineddevices'
$DomainName = 'XXX' # Update these values for your environment

function InsertReplaceTableEntity($TableName, $PartitionKey, $RowKey, $entity) {
    $version = "2017-04-17"
    $resource = "$tableName(PartitionKey='$PartitionKey',RowKey='$Rowkey')"
    $table_url = "https://$storageAccount.table.core.windows.net/$resource"
    $GMTTime = (Get-Date).ToUniversalTime().toString('R')
    $stringToSign = "$GMTTime`n/$storageAccount/$resource"
    $hmacsha = New-Object System.Security.Cryptography.HMACSHA256
    $hmacsha.key = [Convert]::FromBase64String($accesskey)
    $signature = $hmacsha.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign))
    $signature = [Convert]::ToBase64String($signature)
    $headers = @{
        'x-ms-date'    = $GMTTime
        Authorization  = "SharedKeyLite " + $storageAccount + ":" + $signature
        "x-ms-version" = $version
        Accept         = "application/json;odata=fullmetadata"
    }
    $body = $entity | ConvertTo-Json
    Invoke-RestMethod -Method PUT -Uri $table_url -Headers $headers -Body $body -ContentType application/json
}

# GPO calculation
$RegPath = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Group Policy\State\Machine\Extension-List\{00000000-0000-0000-0000-000000000000}'
$LowTime = Get-ItemProperty -path $RegPath -name "EndTimeLo"
$HighTime = Get-ItemProperty -path $RegPath -name "EndTimeHi"
$CompTime = ([long]$HighTime.EndTimeHi -shl 32) + [long] $LowTime.EndTimeLo
$GPOProcessDate = [DateTime]::FromFileTime($CompTime)

# Reduce some calls
$dsregcmd = (C:\Windows\System32\dsregcmd.exe /status)
$computerinfo = Get-ComputerInfo 
$wmiobjectw32 = Get-WmiObject -class win32_bios

$body = @{
    # Required values 
    RowKey                     = $($env:COMPUTERNAME)
    PartitionKey               = 'domaineddevices'

    # Optional values
    AzureADJoinedStatus        = ($dsregcmd | Select-String 'AzureADJoined' | Out-String).replace(' ', '').replace("`n", '').replace("`r", '')
    AdminAccountPresent        = if ((Get-LocalUser).Name -Contains 'LocalAdmin') { $true } else { $false }
    Domain                     = $env:USERDOMAIN
    DomainJoinStatus           = ($dsregcmd | Select-String 'DomainJoined' | Out-String).replace(' ', '').replace("`n", '').replace("`r", '')
    EnterpriseJoinedStatus     = ($dsregcmd | Select-String 'EnterpriseJoined' | Out-String).replace(' ', '').replace("`n", '').replace("`r", '')
    FortiClientVPNFilesPresent = if (Test-Path -Path 'C:\Program Files\Fortinet\FortiClient' -ErrorAction SilentlyContinue) { $true } else { $false }
    FortiClientVPNRunning      = if (Get-Process -ProcessName 'FortiTray' -ErrorAction SilentlyContinue) { $true } else { $false }
    # # GPOProcessDate             = [datetime]::FromFileTime(([Int64] ((Get-ItemProperty -Path "Registry::HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Group Policy\State\Machine\Extension-List\{00000000-0000-0000-0000-000000000000}").startTimeHi) -shl 32) -bor ((Get-ItemProperty -Path "Registry::HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Group Policy\State\Machine\Extension-List\{00000000-0000-0000-0000-000000000000}").startTimeLo)).toString()
    GPOProcessDate             = [datetime]$GPOProcessDate
    LogonServer                = $env:LOGONSERVER | Out-String
    Manufacturer               = ($wmiobjectw32).Manufacturer
    NetworkMAC                 = (Get-WmiObject win32_networkadapterconfiguration | Select-Object Description, MACaddress, IPAddress, DefaultIPGateway, DNSDomain) | Out-String
    OSBuild                    = (($computerinfo).OsHardwareAbstractionLayer | Out-String).replace(' ', '').replace("`n", '').replace("`r", '')
    OSEdition                  = (($computerinfo).WindowsProductName | Out-String).replace(' ', '').replace("`n", '').replace("`r", '')
    OSVersion                  = [int32]((($computerinfo).WindowsVersion | Out-String).replace(' ', '').replace("`n", '').replace("`r", ''))
    QuestODMAgentRunning       = if (Get-Process -ProcessName 'BinaryTree.ADM.Agent' -ErrorAction SilentlyContinue) { $true } else { $false }
    QuestODMFilesPresent       = if (Test-Path -Path 'C:\Program Files (x86)\Quest\On Demand Migration Active Directory Agent' -ErrorAction SilentlyContinue) { $true } else { $false }
    ScriptVersion              = [int32]$ScriptVersion
    SerialNumber               = ($wmiobjectw32).SerialNumber
    StorageType                = (Get-PhysicalDisk).MediaType | Out-String
    Traceroute                 = (Test-NetConnection -TraceRoute $DomainName -Hops 5 -ErrorAction SilentlyContinue) | Out-String
    Uptime                     = (New-TimeSpan -Start (Get-CimInstance -Class Win32_OperatingSystem -Property LastBootUpTime).LastBootUpTime -End (Get-Date)).ToString()
    Users                      = (Get-ChildItem -Path 'C:\Users\' | ForEach-Object {
            $size = Get-ChildItem -Path $_.FullName -Recurse -ErrorAction SilentlyContinue | Measure-Object -Property Length -Average -Sum -ErrorAction SilentlyContinue
            Write-Output $_.Name, $_.LastWriteTime.ToString("yyyy-MM-dd"), "$([math]::round($size.sum/1GB)) GB", '---' }) | Out-String
    WindowsVPNManualStatus     = (Get-VpnConnection -ErrorAction SilentlyContinue).Name | Out-String
    WindowsVPNStatus           = (Get-VpnConnection -AllUserConnection -ErrorAction SilentlyContinue).Name | Out-String
}

Write-Host 'Creating new or updating table entity'
InsertReplaceTableEntity -TableName $TableName -entity $body -RowKey $body.RowKey -PartitionKey $body.PartitionKey

Write-Host 'Outputting all values for log:'
Write-Host $body 
Stop-Transcript

Save that script somewhere your domain machines can reach, like SYSVOL.

Group Policy Object

After saving the file to the domain controller, create a GPO with the following items:

Computer Configuration > Preferences > Windows Settings > File

General tab:

Screenshot of Group Policy Management Editor File wizard

  • Source file(s): \\domain.local\SYSVOL\domain.local\scripts\Check-DomainStatus.ps1

  • Destination File: C:\temp\Check-DomainStatus.ps1

Computer Configuration > Control Panel Settings > Scheduled Tasks

General tab:

Screenshot of Group Policy Management Editor Scheduled Tasks wizard

  • Action: Replace

  • Name: Check-DomainStatus

  • When running the task, use the following user account: NT AUTHORITY\System

  • Run whether user is logged on or not

  • Run with highest privileges

  • Configure for: Windows Vista or Windows Server 2008

Triggers tab:

Screenshot of Group Policy Management Editor Scheduled Tasks wizard

  • New > Begin the task: On a schedule

  • Daily, Recur every: 1 days

  • Repeat task every: 1 hour for a duration of: 1 day

  • Enabled

Actions tab:

  • New > Action > “Start a program”

  • Program/script: powershell.exe

  • Add arguments (optional): -NoProfile -ExecutionPolicy Bypass -File "c:\temp\Check-DomainStatus.ps1"

Conditions tab:

Screenshot of Group Policy Management Editor Scheduled Tasks wizard Conditions tab

  • All options unchecked.

Settings tab:

Screenshot of Group Policy Management Editor Scheduled Tasks wizard Settings tab

  • Allow task to be run on demand

  • Run task as soon as possible after a scheduled start is missed

  • Stop the task if it runs longer than 1 hour

  • If the running task does not end when requested, force it to stop

  • If the task is already running, then the following rule applies: Do not start a new instance

Once deployed, the task will be available on the local machine in Task Scheduler and can be started immediately:

Screenshot of Task Scheduler MMC

Azure Storage Explorer

After deploying the script, you can use the Azure Storage Explorer app to view and export the data as it arrives:

Screenshot of Azure Storage Explorer opening a Table
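
If you'd rather pull the data down with PowerShell instead of Storage Explorer, something like this should work. It's a minimal sketch that assumes the Az.Storage and AzTable modules are installed, and it reuses the storage account name, access key, and table name from the script above:

# Sketch: export the table to CSV using Az.Storage and the AzTable module
Import-Module Az.Storage
Import-Module AzTable

$ctx = New-AzStorageContext -StorageAccountName 'STORAGEACCOUNT' -StorageAccountKey 'XXX'
$table = (Get-AzStorageTable -Name 'domaineddevices' -Context $ctx).CloudTable

# Pull every entity and export it for Excel, Power BI, or other tools
Get-AzTableRow -Table $table | Export-Csv -Path .\domaineddevices.csv -NoTypeInformation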


#Windows #Powershell

One of the companies I work with just added RADIUS authentication to an SSID on Meraki APs. To do this, they set up an NPAS role on the domain controller and connected it to the Meraki config.

Most users are able to connect, but some are experiencing issues logging into the network, receiving the message “Can't connect to this network” when they try to authenticate:

Screenshot of Windows 10 wifi network message saying 'Can't connect to this network'

I troubleshot everything I could think of (local machine, domain trust, user password, RADIUS/LDAP settings, Meraki authentication settings, etc.) and found that the issue was the msNPAllowDialin attribute being set to “False”.

You can view this attribute by opening a user in ADUC when you're remoted onto a domain controller. Make sure you have the option checked under View > Advanced Features. Open the Dial-in tab and check the Network Access Permission field:

Screenshot of Active Directory Users and Computers Dial-In tab

Now that I know the root cause, I wanted to find how many active users were affected. To get a list of all the users, I ran this Powershell command:

Import-Module ActiveDirectory
Get-ADUser -Filter "enabled -eq 'true'" -Properties Name,msNPAllowDialin | Select-Object Name,msNPAllowDialin | Sort-Object -Property Name | Export-Csv -Path .\out.csv -NoTypeInformation

I opened the CSV in Excel and was able to sort by “False” and find the users that had the attribute.
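
If you'd rather skip Excel, the same filtering can be done directly in PowerShell. A small sketch using the same attribute as above:

# List enabled users whose msNPAllowDialin attribute is explicitly set to False
Get-ADUser -Filter "enabled -eq 'true'" -Properties msNPAllowDialin |
    Where-Object { $_.msNPAllowDialin -eq $false } |
    Select-Object Name, SamAccountName |
    Sort-Object Name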

From what I understand, the msNPAllowDialin attribute should be null (not set at all) so that NPAS handles the authentication.

With this in mind, I was able to clean up the environment by running this Powershell command:

Get-ADUser -Filter "enabled -eq 'true'" -Properties Name,msNPAllowDialin | Where-Object {$_.contains('msNPAllowDialin') -eq $true} | Set-ADUser -Clear msNPAllowDialin

Alternatively, as described by this Microsoft Doc [A] I could have checked the option inside the NPAS settings for “Ignore user account dial-in properties”.


#Windows

I needed to repair an Office 365 installation on a PC that had UAC turned off. I ran into problems:

  • I couldn't switch users on a screenshare to open the GUI, so I needed to use the Command Prompt.

  • I couldn't remember the command to repair Office from CMD.

I started looking.

Once I located the command on this page [A], I found the arguments I needed, but the path to the ClickToRun.exe file was pointing to an older version of Office.

Here's the updated command for repairing 64-bit Office 365 applications from the command line:

"C:\Program Files\Common Files\microsoft shared\ClickToRun\OfficeClickToRun.exe" scenario=Repair
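
If you happen to be in a PowerShell window instead of cmd, the same repair can be started with the call operator:

& 'C:\Program Files\Common Files\microsoft shared\ClickToRun\OfficeClickToRun.exe' scenario=Repair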



#Powershell #Exchange

I got a request to create a Dynamic Distribution List/Group in Exchange that was automatically populated based on the users' office location.

The requestor stated that they do not want to manage any additional O365 objects. I know how to do this in Azure AD with a Dynamic Assignment, but needed to figure out how to do this in Exchange Online.

Luckily, it's pretty easy.

You'll need the Exchange Online Powershell module (ExchangeOnlineManagement) installed before running these commands.

Import-Module ExchangeOnlineManagement
Connect-ExchangeOnline
New-DynamicDistributionGroup -Name 'Raleigh Staff' -Alias 'Raleigh.Staff' -RecipientFilter "(RecipientTypeDetails -eq 'UserMailbox') -and (Office -eq 'Raleigh')"

It might take a few minutes, but after running that command, you'll see it update in the Exchange Online portal and the query will add users to the List/Group.
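
If you want to double-check which mailboxes the filter picks up, you can preview the membership with Get-Recipient (a quick sketch using the group created above):

# Preview which recipients the dynamic group's filter currently matches
$ddg = Get-DynamicDistributionGroup -Identity 'Raleigh Staff'
Get-Recipient -RecipientPreviewFilter $ddg.RecipientFilter | Select-Object DisplayName, PrimarySmtpAddress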


#Powershell #Windows #Networking

I have a client that is transitioning their network equipment from Fortigate to Meraki. Part of this transition is testing the Meraki Client VPN instead of the FortiClient application.

We found that, on first run, the FortiClient VPN app disables some services that are needed for the Meraki VPN connection to authenticate successfully. If users don't have Local Admin permissions, they are unable to make any changes to the services to fix the issue.

To work around this, I created a small PowerShell script that can be deployed through GPO or Intune. It stops all of the FortiClient services and processes and re-enables the services that Meraki's VPN uses. It also creates a transcript and writes the log to C:\Fix-MerakiVPN.log, which you can use for troubleshooting.

Here's the script:

#Requires -Version 1
<#
.SYNOPSIS
  Closes and disables FortiClient VPN services and apps. Checks and configures Windows services to allow Meraki VPN connection.
.DESCRIPTION
  Closes and disables FortiClient VPN services and apps. Checks and configures Windows services to allow Meraki VPN connection.
.INPUTS
  None
.OUTPUTS
  Log file stored in C:\Fix-MerakiVPN.log
.NOTES
  Version:        1.0
  Author:         Tim D'Annecy
  Creation Date:  2022-06-07
  Purpose/Change: Initial script development
.EXAMPLE
  Fix-MerakiVPN.ps1 
#>

$ServicesToStop = 'FA_Scheduler'#, 'FMAPOService'
$ServicesToStart = 'PolicyAgent', 'IKEEXT'
$AppsToStop = 'FortiClient', 'FortiSettings', 'FortiSSLVPNdaemon', 'FortiTray'

function Fix-MerakiVPN {
  foreach ($App in $AppsToStop) {
    if (Get-Process -Name $App -ErrorAction SilentlyContinue) {
      Write-Host 'Application running. Stopping:' $App
      Stop-Process -Name $App -Force 
    }
    else {
      Write-Host 'OK: Application not running or not installed:' $App
    }
  }
  foreach ($service in $ServicesToStop) {
    if ((Get-Service $service -ErrorAction SilentlyContinue).status -eq 'Running') {
      Write-Host 'Service running. Stopping:' $service
      $ServicePID = (get-wmiobject win32_service | Where-Object { $_.name -eq $service }).processID
      Stop-Process $ServicePID -Force
      Set-Service $service -StartupType Disabled
    }
    else {
      Write-Host 'OK: Service not running or not installed:' $service
    }
  }
  foreach ($service in $ServicesToStart) {
    if ((Get-Service $service -ErrorAction SilentlyContinue).status -eq 'Running') {
      Write-Host 'OK: Service running:' $service
    }
    else {
      Write-Host 'Service not running. Starting:' $service
      Set-Service $service -StartupType Automatic -Status Running 
      Start-Service $service 
    }
  }
}

Start-Transcript -Path 'C:\Fix-MerakiVPN.log' -Append
Fix-MerakiVPN
Stop-Transcript
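
After the script runs, a quick way to confirm the services the Meraki VPN depends on are back in the expected state (using the same service names as the script; StartType is available in PowerShell 5 and later):

Get-Service -Name PolicyAgent, IKEEXT | Select-Object Name, Status, StartType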


#Windows #Powershell

I received this error on a fresh Windows 10 install on a machine provisioned by Autopilot.

I found a fix on this post [A] and wanted to paste out the Powershell command for future reference.

$WinRMClient = "HKLM:\SOFTWARE\Policies\Microsoft\Windows\WinRM\Client"
$Name = "AllowBasic"
$Value = "1"

# Create the key if it doesn't exist, then set the DWORD value
if (!(Test-Path $WinRMClient)) {
   New-Item -Path $WinRMClient -Force | Out-Null
}
New-ItemProperty -Path $WinRMClient -Name $Name -Value $Value -PropertyType DWORD -Force | Out-Null
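
To confirm the value landed, you can read it back:

# Should show AllowBasic : 1 once the policy value is in place
Get-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WinRM\Client' -Name 'AllowBasic'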


#Azure

I received a request for a hosted SFTP solution for one of the clients I work with.

Currently, there are some recommended templates from Microsoft [A] that include Blob storage and a Debian container that serves up the SFTP service. While this solution will work, I'm looking for a solution that's easier to manage.

Azure recently added the ability (currently in preview) to enable an SFTP endpoint directly on a Storage Account.

Here's how you do it.

Open the Azure Portal and create a new Storage Account. You should be able to add this feature to an existing Storage Account as well, as long as it is StorageV2 (general purpose v2) or a premium block blob account.

Screenshot of Azure Dashboard creation of new Storage Account

On the Advanced tab, check the boxes for “Enable hierarchical namespace” and “Enable SFTP (preview)”:

Screenshot of the Azure Create a Storage Account wizard Advanced tab

Leave the rest of the options as defaults.

After creating the resource, navigate to the “SFTP (Preview)” blade under the Settings header. From there, click “Add local user”. Type the name for the user and check the box to use an SSH Password:

Screenshot of Azure Storage account SFTP preview, Add Local User option, Username + Authentication tab

Still in the “Add local user” side menu, click on the “Container permissions” tab and add a new container that you plan to use for storage. Change the permissions to fit what kind of access you need, then set the “Home directory” to the virtual folder you want to use. If this value is not set correctly, you'll have connection/mounting issues later on. To keep things simple, I set it to the Container that I set in the top row:

Screenshot of Azure Storage account SFTP preview, Add Local User option, Container permissions tab

Copy the SFTP user password somewhere like Notepad and return to the SFTP blade.

After getting the SFTP features set up, you can connect to your container using the connection information in the “Connection string” option. Click on the Copy icon:

Screenshot of Azure Storage account SFTP preview, Connection string option

Next, get an SFTP app like Filezilla to setup the connection. Paste the “Connection string” into Filezilla's “Host” field. The username will populate from this information. Paste in the password that you saved from earlier, then click “Quickconnect”:

Screenshot of Filezilla with a successful SFTP connection to Azure Storage
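
If you'd rather test from a terminal instead of Filezilla, the built-in OpenSSH client works too. The username format below is an assumption (storage account, container, then local user, separated by dots); use whatever the portal's “Connection string” field shows for your account:

sftp myaccount.mycontainer.myuser@myaccount.blob.core.windows.net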

And that's it!

There are a few things I'd like to see added before it's fully featured (the ability to use Azure AD accounts and Managed Identities), but right now it's working great as a fully managed SFTP endpoint that you can run for a few dollars per month: it supports the regular Blob features (Soft Delete, Azure Monitoring, Firewalls, etc.) and can hook into a vNet using a Service Endpoint.




#Azure #Powershell

If you have a ton of compute and storage resources in your Azure environment, it can be difficult to tell which managed disks are orphaned and not mounted to a virtual machine. This Microsoft Doc [A] has the AZ CLI command, but it's buried in a larger task that deletes the found objects. The Docs page also doesn't include a Powershell equivalent, and I prefer to use Powershell.

To find these disks, run one of these commands in a Cloud Shell or other Azure connected terminal:

AZ CLI:

az disk list --query '[?managedBy==`null`].[id]' -o tsv

Powershell:

Get-AzDisk | Where-Object {$_.ManagedBy -eq $null}
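
To turn that into a review list (for example, before deleting anything), you can select a few identifying properties and export them:

# List unattached managed disks with enough detail to review before cleanup
Get-AzDisk |
    Where-Object { $null -eq $_.ManagedBy } |
    Select-Object Name, ResourceGroupName, DiskSizeGB, Location |
    Export-Csv -Path .\orphaned-disks.csv -NoTypeInformation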


#Windows #Powershell

One of the companies I'm working with has an Intune installation package for Adobe Acrobat Pro DC version 15.007.20033, but users run into a sign-in issue on any PC that gets the deployment. Even newly imaged computers running Windows 10 21H2 get the error.

The package was created in Intune as a regular Line of Business app using a freshly generated .msi file from the Adobe Admin console under the Packages tab:

To get the application rolled out, I created a security group in Azure Active Directory named “Adobe Acrobat Pro DC users” that is used for the following tasks (not in this order):

  1. Uninstalls Adobe Reader DC (This removal is to simplify the user experience opening .pdf files, but isn't needed for functionality)

  2. Provisions an Adobe Acrobat Pro DC license using a configuration in Enterprise Applications

  3. Installs the Adobe Acrobat Pro DC .msi file

After the Adobe Acrobat Pro DC installation completes on a user's computer, the user is prompted on first run to log in with their Adobe account. Since these users are already provisioned, it should be an easy click-through. When the user hits the sign-in page, however, an error message appears and doesn't let them continue:

Update required: Your browser or operating system is no longer supported. You may need to install the latest updates to your operating system. Learn more.

This seems to be a bug, and users are reporting the issue on the Adobe Community forums [A]. The post notes that an older version of the file AASIapp.exe is causing the update error message, and it includes some steps from Adobe Support that can be used to fix the issue.

I wanted to make this deployable in Intune, so I wrote the following script:

function Invoke-AdobeAcrobatDCFix {
    $DownloadURI = 'http://prdl-download.adobe.com/Framemaker/428037A8066D4558A7EF7D7D06CB5B72/1600836995996/AASIapp.exe'
    $DownloadDestination = 'C:\temp\AASIapp.exe'
    $AppDestination = 'C:\Program Files (x86)\Common Files\Adobe\OOBE\PDApp\P7'

    Invoke-WebRequest -Uri $DownloadURI -OutFile (New-Item -Path $DownloadDestination -Force)
    Copy-Item -Path $DownloadDestination -Destination $AppDestination -Force
}

Invoke-AdobeAcrobatDCFix

To get this working in your environment, follow these steps:

  1. Copy the script snippet above and paste it into a text editor. Save it as a .ps1 file.

  2. Open https://endpoint.microsoft.com/

  3. Navigate to Devices > Scripts > Add > Windows 10 and later:

  4. Move through the wizard to upload and configure your script deployment:

    1. Basics: Name it something you'll remember and add a description.

    2. Script settings: Upload the .ps1 file you saved earlier. Leave the other options on the default “No” setting.

    3. Assignments: Select the user group that you're using for the Adobe Acrobat Pro DC app deployment. In my environment, this is the “Adobe Acrobat Pro DC users” security group.

  5. Keep the Scripts tab open for a few seconds. After the upload message pops up, the deployment will begin to sync to devices:

You can check the deployment process on the PC by looking for the C:\temp\ folder or for a newer timestamp on the file at C:\Program Files (x86)\Common Files\Adobe\OOBE\PDApp\P7\AASIapp.exe:

If the script fails, you can check the Intune application log at C:\ProgramData\Microsoft\IntuneManagementExtension\Logs\AgentExecutor.log for Powershell error messages.
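
Two quick spot checks you can run on a target PC, using the same paths mentioned above:

# Check when the replacement AASIapp.exe landed
(Get-Item 'C:\Program Files (x86)\Common Files\Adobe\OOBE\PDApp\P7\AASIapp.exe').LastWriteTime

# Tail the Intune script log for any Powershell errors
Get-Content 'C:\ProgramData\Microsoft\IntuneManagementExtension\Logs\AgentExecutor.log' -Tail 50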

If Adobe decides to stop hosting the file, this process could stop working. You might want to download that .exe file and put it in a public container in an Azure Storage account.

I'm sure this script could be shorter and the process could be more streamlined (I'm thinking of editing the .msi file), but it's working for me and doesn't require much upkeep. After a user is added to the “Adobe Acrobat Pro DC users” security group, they'll have a fully working Acrobat Pro installation a little while later.

I hope this helps!

