PowerShell and Python Automation

Most enterprise Azure operational work involves automation at two levels: Windows-side scripting for administration, discovery, and data collection (PowerShell), and data-side scripting for analysis, API integration, and tooling (Python). The split sounds clean on paper; the tricky part in practice is knowing where one ends and the other begins.

My rule: PowerShell when the work is Windows, Azure CLI/API calls, or scheduled OS tasks. Python when the work is data-heavy, API-heavy, or becoming a real tool.

PowerShell patterns

WinRM connectivity testing across a server list

I use this before any large-scale remoting operation — before running discovery scripts, before deploying agents, before any mass configuration change. It tells me which servers are reachable, which are pingable but WinRM-blocked, and which are just down:

param (
    [Parameter(Mandatory=$true)] [string]$TargetListPath,
    [Parameter(Mandatory=$true)] [string]$OutputPath,
    [Parameter(Mandatory=$true)] [System.Management.Automation.PSCredential]$Credential
)

function Test-WinRMConnection {
    param(
        [string]$ComputerName,
        [System.Management.Automation.PSCredential]$Credential
    )
    $sessionOption = New-PSSessionOption -OperationTimeout 60000

    $pingResult = Test-Connection -ComputerName $ComputerName -Count 1 -Quiet
    if (-not $pingResult) {
        # Keep the property set identical across all return paths; Export-Csv
        # derives its columns from the first object it sees.
        return [PSCustomObject]@{
            ComputerName = $ComputerName
            PingSuccess  = $false
            WinRMSuccess = $false
            OSName       = ""
            OSVersion    = ""
            Error        = "No ping response"
        }
    }

    try {
        $session = New-PSSession -ComputerName $ComputerName -Credential $Credential `
            -SessionOption $sessionOption -ErrorAction Stop
        $osInfo = Invoke-Command -Session $session -ScriptBlock {
            [PSCustomObject]@{
                OSName    = (Get-CimInstance Win32_OperatingSystem).Caption
                OSVersion = (Get-CimInstance Win32_OperatingSystem).Version
            }
        }
        Remove-PSSession $session
        return [PSCustomObject]@{
            ComputerName = $ComputerName
            PingSuccess  = $true
            WinRMSuccess = $true
            OSName       = $osInfo.OSName
            OSVersion    = $osInfo.OSVersion
            Error        = ""
        }
    } catch {
        return [PSCustomObject]@{
            ComputerName = $ComputerName
            PingSuccess  = $true
            WinRMSuccess = $false
            OSName       = ""
            OSVersion    = ""
            Error        = $_.Exception.Message
        }
    }
}

$TargetList = Get-Content $TargetListPath | Where-Object { $_.Trim() }
$Results = $TargetList | ForEach-Object { Test-WinRMConnection -ComputerName $_ -Credential $Credential }
$Results | Export-Csv -Path $OutputPath -NoTypeInformation
Write-Host "Results written to $OutputPath"

-OperationTimeout 60000 is in milliseconds and caps how long any single remote operation may run; the connection attempt itself is governed by -OpenTimeout (default 3 minutes), which is the value to tighten for servers that hang during session setup. For non-domain targets, add them to TrustedHosts first, e.g. Set-Item WSMan:\localhost\Client\TrustedHosts -Value "server01,server02" -Force; a wildcard "*" works but is best kept to lab environments.

Network connection collection for dependency analysis

Before Azure Migrate agent-based dependency analysis is deployed, or on servers where the agent can't be installed, I collect network connections using a scheduled script:

param(
    [string]$CsvPath = "C:\Logs\ConnectionData.csv",
    [int]$RetainHours = 6,
    [int]$SampleSeconds = 30
)

function Get-NetworkConnections {
    $rawNetstat = netstat -ano | Select-Object -Skip 4
    foreach ($line in $rawNetstat) {
        if ($line -match '^\s*(?<Protocol>\S+)\s+(?<LocalAddress>\S+)\s+(?<RemoteAddress>\S+)(\s+(?<State>\S+))?\s+(?<PID>\d+)$') {
            # $PID is an automatic variable in PowerShell, so use a different name
            $procId  = $Matches['PID']
            $process = Get-Process -Id $procId -ErrorAction SilentlyContinue
            [PSCustomObject]@{
                Timestamp     = Get-Date -Format 'yyyy-MM-dd HH:mm:ss'
                Protocol      = $Matches['Protocol']
                LocalAddress  = $Matches['LocalAddress']
                RemoteAddress = $Matches['RemoteAddress']
                # UDP rows have no State column in netstat output
                State         = if ($Matches['State']) { $Matches['State'] } else { "" }
                PID           = $procId
                ProcessName   = if ($process) { $process.ProcessName } else { "" }
            }
        }
    }
}

# Collect and append
$connections = Get-NetworkConnections
if (Test-Path $CsvPath) {
    $connections | Export-Csv -Path $CsvPath -Append -NoTypeInformation
} else {
    $connections | Export-Csv -Path $CsvPath -NoTypeInformation
}

# Retention cleanup
if (Test-Path $CsvPath) {
    $cutoff = (Get-Date).AddHours(-$RetainHours)
    $all  = Import-Csv $CsvPath
    $kept = $all | Where-Object { [datetime]$_.Timestamp -gt $cutoff }
    $kept | Export-Csv -Path $CsvPath -NoTypeInformation
}

Register as a scheduled task to run every 5 minutes:

$trigger   = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
$action    = New-ScheduledTaskAction -Execute "PowerShell.exe" `
    -Argument "-NonInteractive -ExecutionPolicy Bypass -File `"C:\Scripts\CollectConnections.ps1`""
$settings  = New-ScheduledTaskSettingsSet -ExecutionTimeLimit (New-TimeSpan -Minutes 4)
$principal = New-ScheduledTaskPrincipal -UserId "SYSTEM" -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName "NetworkConnectionMonitor" `
    -Trigger $trigger -Action $action -Settings $settings -Principal $principal

The -Once -At with -RepetitionInterval pattern is how you create an indefinitely repeating task in PowerShell; on older systems (Windows 8/Server 2012 era) you also had to pass -RepetitionDuration ([TimeSpan]::MaxValue), while current versions repeat indefinitely when it is omitted. The execution time limit (4 minutes) is deliberately shorter than the repeat interval (5 minutes) to prevent overlapping instances.

Log Analytics data extraction via monthly batches

Invoke-AzOperationalInsightsQuery has a 10-minute timeout and a 500,000-row result cap. For large datasets, I batch by month:

param(
    [string]$WorkspaceId,
    [string]$SubscriptionId,
    [string]$StartDate = "2024-01-01",
    [int]$MonthsToExtract = 6,
    [string]$OutputFolder = "C:\LAExport"
)

Connect-AzAccount
Select-AzSubscription -SubscriptionId $SubscriptionId

New-Item -ItemType Directory -Force -Path $OutputFolder | Out-Null

for ($i = 0; $i -lt $MonthsToExtract; $i++) {
    $start = (Get-Date $StartDate).AddMonths($i)
    # Exclusive upper bound: a closed "between" ending on the month's last day
    # would drop everything after midnight on that day
    $end = $start.AddMonths(1)
    $outFile = "$OutputFolder\data_$($start.ToString('yyyy-MM')).csv"
    $query = @"
AzureMigrateDependencies_v1_CL
| where TimeGenerated >= datetime($($start.ToString('yyyy-MM-dd'))) and TimeGenerated < datetime($($end.ToString('yyyy-MM-dd')))
| where isnotempty(SourceServerName) and isnotempty(DestinationServerName)
| project TimeGenerated, SourceServerName, DestinationServerName, DestinationPort, SourceApplication
"@
    Write-Host "Extracting $($start.ToString('yyyy-MM'))..."
    $result = Invoke-AzOperationalInsightsQuery -WorkspaceId $WorkspaceId -Query $query
    $result.Results | Export-Csv -Path $outFile -NoTypeInformation
    Write-Host "  Wrote $($result.Results.Count) rows to $outFile"
}

If you hit the row limit, reduce the date range further or add more filters to the query to narrow results before extraction.
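When a single month still blows past the cap, I shrink the window further. A minimal sketch of the date-splitting logic in Python (the helper name and the 10-day window are my own choices, not anything from an Azure SDK):

```python
from datetime import date, timedelta

def split_range(start: date, end: date, max_days: int):
    """Yield (start, end) windows of at most max_days days covering [start, end)."""
    cursor = start
    while cursor < end:
        window_end = min(cursor + timedelta(days=max_days), end)
        yield cursor, window_end
        cursor = window_end

# Example: break January 2024 into at-most-10-day extraction windows
windows = list(split_range(date(2024, 1, 1), date(2024, 2, 1), 10))
```

Each (start, end) pair drops straight into the half-open TimeGenerated filter above, so no day is counted twice or skipped.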

DNS record export from Windows DNS

For hybrid network migrations, I export all A and CNAME records from on-premises DNS to understand what resolves to what before repointing to Azure Private DNS zones:

$ForwardZones = Get-DnsServerZone | Where-Object {
    ($_.ZoneType -in @("Primary", "Secondary")) -and
    -not $_.IsAutoCreated -and
    -not $_.IsReverseLookupZone
}

$Results = foreach ($Zone in $ForwardZones) {
    $ZoneName = $Zone.ZoneName
    $ARecords = Get-DnsServerResourceRecord -ZoneName $ZoneName -RRType "A" -ErrorAction SilentlyContinue
    $CNAMEs   = Get-DnsServerResourceRecord -ZoneName $ZoneName -RRType "CNAME" -ErrorAction SilentlyContinue

    foreach ($r in $ARecords) {
        [PSCustomObject]@{
            Zone  = $ZoneName
            Name  = $r.HostName
            Type  = "A"
            Value = $r.RecordData.IPv4Address.ToString()
            TTL   = $r.TimeToLive.TotalSeconds
        }
    }
    foreach ($r in $CNAMEs) {
        [PSCustomObject]@{
            Zone  = $ZoneName
            Name  = $r.HostName
            Type  = "CNAME"
            Value = $r.RecordData.HostNameAlias
            TTL   = $r.TimeToLive.TotalSeconds
        }
    }
}

$Results | Export-Csv -Path "C:\dns-export-$(Get-Date -Format 'yyyyMMdd').csv" -NoTypeInformation
Write-Host "Exported $($Results.Count) records"

Run this on the DNS server itself, or on a machine with the RSAT DNS Server tools installed. Filtering on IsReverseLookupZone is the reliable way to exclude reverse lookup zones; IsAutoCreated alone only catches the system-generated ones, not reverse zones that were created manually.
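The export is also easy to post-process in Python, for example flattening CNAME chains down to their final A records before planning Private DNS zone moves. A sketch, assuming the column names from the export above (`index_records` and `resolve` are my own helper names, and the hop guard is a simple loop-protection choice):

```python
import csv

def index_records(rows):
    """Split exported DNS rows into A-record and CNAME lookup tables keyed by FQDN."""
    a, cnames = {}, {}
    for row in rows:
        fqdn = f"{row['Name']}.{row['Zone']}".lower()
        if row["Type"] == "A":
            a[fqdn] = row["Value"]
        elif row["Type"] == "CNAME":
            # Normalize trailing dot and case on the alias target
            cnames[fqdn] = row["Value"].rstrip(".").lower()
    return a, cnames

def resolve(name, a, cnames, max_hops=10):
    """Follow CNAME hops until an A record is found; None if unresolvable."""
    name = name.lower()
    for _ in range(max_hops):
        if name in a:
            return a[name]
        if name not in cnames:
            return None
        name = cnames[name]
    return None

# Usage against the export file:
# with open("dns-export-20240101.csv", newline="") as f:
#     a, cnames = index_records(csv.DictReader(f))
```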

IIS web.config connection string extraction

For application dependency analysis before a database migration, I extract connection strings from all IIS-hosted applications:

Import-Module WebAdministration

$Results = foreach ($Site in (Get-Website)) {
    foreach ($App in (Get-WebApplication -Site $Site.Name)) {
        $ConfigPath = Join-Path $App.PhysicalPath "web.config"
        if (Test-Path $ConfigPath) {
            try {
                [xml]$Config = Get-Content $ConfigPath -Raw
                foreach ($cs in $Config.configuration.connectionStrings.add) {
                    [PSCustomObject]@{
                        Site             = $Site.Name
                        App              = $App.Path
                        ConnectionName   = $cs.name
                        ConnectionString = $cs.connectionString
                        ProviderName     = $cs.providerName
                    }
                }
            } catch {
                Write-Warning "Could not parse $ConfigPath : $_"
            }
        }
    }
}

$Results | Export-Csv "C:\iis-connection-strings.csv" -NoTypeInformation

If connection strings are encrypted, decrypt them first with aspnet_regiis -pdf "connectionStrings" <appPath>. Check the machine-level web.config too; strings can be inherited from C:\Windows\Microsoft.NET\Framework64\<version>\CONFIG\web.config. Also note that Get-WebApplication only returns child applications, so connection strings defined in a site root's own web.config need a separate pass over $Site.PhysicalPath.
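Once the CSV exists, pulling server names out of the strings is a small parsing job. A sketch, assuming standard "key=value;key=value" SQL Server-style connection strings (the helper names are my own):

```python
def parse_conn_string(cs):
    """Split a "key=value;key=value" connection string into a dict with lowercased keys."""
    parts = {}
    for chunk in cs.split(";"):
        if "=" in chunk:
            key, _, value = chunk.partition("=")
            parts[key.strip().lower()] = value.strip()
    return parts

def server_of(cs):
    """Best-effort server extraction: SQL providers use "Data Source" or "Server"."""
    p = parse_conn_string(cs)
    return p.get("data source") or p.get("server")
```

Grouping the extracted servers against the migration wave plan is then a one-liner in whatever analysis layer you prefer.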

Python patterns

Log Analytics ingestion via the Logs Ingestion API

When I need to push CSV data into a custom Log Analytics table (e.g., migration wave planning data, CMDB exports, third-party tool output), I use the Azure Monitor Logs Ingestion API:

from azure.monitor.ingestion import LogsIngestionClient
from azure.identity import DefaultAzureCredential
import csv

credential = DefaultAzureCredential()
client = LogsIngestionClient(
    endpoint="https://<dce-name>.<region>.ingest.monitor.azure.com",
    credential=credential,
)

def upload_csv(csv_path: str, dcr_immutable_id: str, stream_name: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    batch_size = 1000
    for i in range(0, len(rows), batch_size):
        client.upload(
            rule_id=dcr_immutable_id,
            stream_name=stream_name,
            logs=rows[i : i + batch_size],
        )
    print(f"Uploaded {len(rows)} rows")

The endpoint is the Data Collection Endpoint (DCE) URL — not the Log Analytics workspace URL. The DCR (Data Collection Rule) must have a streamDeclaration that matches the CSV column names. Column names in the DCR are case-sensitive.
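Because of that case sensitivity, it's worth preflighting the CSV header against the declared columns before uploading. A minimal sketch (`mismatched_columns` is my own helper name; feed it the column names from your DCR's streamDeclaration):

```python
def mismatched_columns(csv_header, declared_columns):
    """CSV header names with no exact, case-sensitive match in the DCR stream declaration."""
    declared = set(declared_columns)
    return [col for col in csv_header if col not in declared]

# Usage: pass csv.DictReader(f).fieldnames as csv_header
```

Anything the check returns would land as a silently unmapped column, so fixing the header casing first is cheaper than debugging missing data in the table.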

The hybrid collect-then-analyze pattern

The pattern that recurs most often in migration and operations work:

  1. Collect with PowerShell — WinRM remoting, netstat, registry queries, IIS/Web.config parsing, DNS exports
  2. Extract to CSV or Log Analytics — PowerShell Export-Csv or the Python ingestion client above
  3. Analyze with KQL or Python — connection relationship analysis, port summarization, dependency grouping

Each layer does what it's best at. PowerShell reaches into Windows internals. Python handles data transformation and API integration. Log Analytics provides the queryable, long-term store.
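As a concrete instance of step 3, collapsing the netstat collector's CSV into (source, destination, port) dependency counts takes only a few lines. A sketch, assuming the column names from the collector above (the loopback/wildcard filter list is my own choice):

```python
from collections import Counter

def summarize_dependencies(rows):
    """Count observations per (local IP, remote IP, remote port) triple.

    rows: dicts with LocalAddress/RemoteAddress fields in "ip:port" form,
    as produced by the netstat collection script.
    """
    pairs = Counter()
    for row in rows:
        local_ip = row["LocalAddress"].rsplit(":", 1)[0]
        remote_ip, _, remote_port = row["RemoteAddress"].rpartition(":")
        if remote_ip in ("127.0.0.1", "[::1]", "0.0.0.0", "*"):
            continue  # skip loopback, wildcard listeners, and stateless UDP rows
        pairs[(local_ip, remote_ip, remote_port)] += 1
    return pairs
```

Repeated observations of the same triple across collection intervals are what distinguish a real dependency from a one-off connection, which is exactly the signal wave planning needs.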