AWS SAM Regional API Gateways

The AWS Serverless Application Model (SAM) now supports REGIONAL API Gateway endpoints: just add EndpointConfiguration: 'REGIONAL' to your AWS::Serverless::Api resource:

ApiGateway:
  Type: AWS::Serverless::Api
  Properties:
    DefinitionUri: apigateway.yaml
    StageName: api
    EndpointConfiguration: 'REGIONAL'

This is currently an undocumented feature, but in my testing it works fine.

Encrypting Files using PowerShell and AWS KMS

In my last post I explained how to encrypt secrets, e.g. passwords, using AWS Key Management Service (KMS) so you could store encrypted passwords securely in config files.  The next obvious question I have been asked is "Can you encrypt whole files using KMS?".

The first answer to this question is yes, very easily, if you store the files in S3. However, your use case may not allow you to store your files in S3, in which case you need a way to encrypt files with KMS outside of S3; this is where a technique called "Envelope Encryption" can be used.

Envelope encryption works by using a combination of two keys, your master KMS key and a randomly generated Data key that the KMS service creates on-demand for you.  When requesting a Data key from KMS, you are supplied with both an unencrypted and an encrypted (with the KMS master-key) version of the Data key.

The Data key is a symmetric encryption key that is used to encrypt files locally on your machine. Symmetric encryption is very fast, but because the same key is used to encrypt and decrypt your data you need to protect the symmetric key somehow; in this case the Data key is protected by encrypting it with the Master key.  Once you have encrypted the local file with the unencrypted Data key, you throw the unencrypted key away and store the encrypted Data key alongside the encrypted file.

Decrypting the file is a two-step process: use KMS to decrypt the encrypted Data key, then use the decrypted Data key to decrypt the file.  There is a detailed explanation of how envelope encryption works here.

Below are the two functions required to encrypt and decrypt a file using KMS envelope encryption.  I have been able to use them to encrypt a file, confirm the contents are unreadable, and then decrypt the file to get it back to its original state.

Function to Encrypt

function Invoke-KMSEncryptFile
(
    [Parameter(Mandatory=$true,Position=1,HelpMessage='File to encrypt')]
    [ValidateScript({(Test-Path $_) -eq $true})]
    [string]$filePath,
    [Parameter(Mandatory=$true,Position=2,HelpMessage='GUID of Encryption Key in KMS')]
    [string]$keyID,
    [Parameter(Mandatory=$true,Position=3)]
    [string]$region,
    [Parameter(Position=4)]
    [string]$AccessKey,
    [Parameter(Position=5)]
    [string]$SecretKey,
    [Parameter(Position=6,HelpMessage='Name of output file')]
    [ValidateScript({(Test-Path $_) -eq $false})]
    [string]$outPath,
    [Parameter(Position=7,HelpMessage='Encryption Strength')]
    [ValidateSet('AES_128','AES_256')]
    [string]$KeySpec = 'AES_128'
)
{
    # Unencrypted byte array of the source file
    $byteArray = [System.IO.File]::ReadAllBytes($filePath)

    # memory stream to receive the encrypted file content
    $memoryStream = New-Object System.IO.MemoryStream

    # splat the common KMS parameters
    $splat = @{keyID=$keyID; Region=$Region; KeySpec=$KeySpec}
    if(![string]::IsNullOrEmpty($AccessKey)){$splat += @{AccessKey=$AccessKey}}
    if(![string]::IsNullOrEmpty($SecretKey)){$splat += @{SecretKey=$SecretKey}}

    # Get KMS Data key (returns both a plaintext and a KMS-encrypted copy of the key)
    try {
        $DataKey = New-KMSDataKey @splat
    }
    catch {
        throw "Failed to get DataKey from KMS key"
    }
    $encryptedDataKey = $DataKey.CiphertextBlob.ToArray()

    # Symmetric (AES) encryption of the file using the plaintext Data key
    $cryptor = New-Object -TypeName System.Security.Cryptography.AesManaged
    $cryptor.Mode = [System.Security.Cryptography.CipherMode]::CBC
    $cryptor.Padding = [System.Security.Cryptography.PaddingMode]::PKCS7
    $cryptor.KeySize = 128
    $cryptor.BlockSize = 128

    $iv = $cryptor.IV

    $cs = New-Object System.Security.Cryptography.CryptoStream($memoryStream, $cryptor.CreateEncryptor($DataKey.PlainText.ToArray(),$iv), [System.Security.Cryptography.CryptoStreamMode]::Write)
    $cs.Write($byteArray,0,$byteArray.Length)
    $cs.FlushFinalBlock()
    $encryptedContent = $memoryStream.ToArray()

    # Create a new byte array that will hold the encrypted Data key, the unencrypted IV and the encrypted file content
    $result = New-Object Byte[] ($encryptedDataKey.Length + $iv.Length + $encryptedContent.Length)

    # copy the Data key, IV and encrypted file arrays into the single output array
    $currentPosition = 0

    [System.Buffer]::BlockCopy($encryptedDataKey, 0, $result, $currentPosition, $encryptedDataKey.Length)
    $currentPosition += $encryptedDataKey.Length

    [System.Buffer]::BlockCopy($iv, 0, $result, $currentPosition, $iv.Length)
    $currentPosition += $iv.Length

    [System.Buffer]::BlockCopy($encryptedContent, 0, $result, $currentPosition, $encryptedContent.Length)
    $currentPosition += $encryptedContent.Length

    if ([string]::IsNullOrEmpty($outPath))
    {
        $finalOutPath = $filePath + '.enc'
    }
    else
    {
        $finalOutPath = $outPath
    }

    # Write bytes to file
    [System.IO.File]::WriteAllBytes($finalOutPath,$result)

    return $finalOutPath
}

Function to Decrypt

function Invoke-KMSDecryptFile
(
    [Parameter(Mandatory=$true,Position=1,HelpMessage='File to decrypt')]
    [ValidateScript({(Test-Path $_) -eq $true})]
    [string]$filePath,
    [Parameter(Mandatory=$true,Position=2)]
    [string]$region,
    [Parameter(Position=3)]
    [string]$AccessKey,
    [Parameter(Position=4)]
    [string]$SecretKey,
    [Parameter(Position=5,HelpMessage='Name of output file')]
    [ValidateScript({(Test-Path $_) -eq $false})]
    [string]$outPath,
    [Parameter(Position=6,HelpMessage='Encryption Strength')]
    [ValidateSet('AES_128','AES_256')]
    [string]$KeySpec = 'AES_128'
)
{
    # read encrypted file into a byte array
    $cipherTextArray = [System.IO.File]::ReadAllBytes($filePath)

    # set up byte arrays; the Data key length is the size of the KMS CiphertextBlob for the chosen KeySpec
    $DataKeyLength = switch ($KeySpec)
    {
        'AES_128' {151}
        'AES_256' {167}
    }
    $encryptedDataKey = New-Object Byte[] $DataKeyLength
    $iv = New-Object Byte[] 16
    $encryptedContent = New-Object Byte[] ($cipherTextArray.Length - ($DataKeyLength + 16))

    # split the file array into three to retrieve the encrypted Data key, IV and data
    $currentPosition = 0

    [System.Buffer]::BlockCopy($cipherTextArray, $currentPosition, $encryptedDataKey, 0, $encryptedDataKey.Length)
    $currentPosition += $encryptedDataKey.Length

    [System.Buffer]::BlockCopy($cipherTextArray, $currentPosition, $iv, 0, $iv.Length)
    $currentPosition += $iv.Length

    [System.Buffer]::BlockCopy($cipherTextArray, $currentPosition, $encryptedContent, 0, $encryptedContent.Length)
    $currentPosition += $encryptedContent.Length

    # memory stream for the encrypted Data key
    $encryptedMemoryStreamToDecrypt = New-Object System.IO.MemoryStream($encryptedDataKey,0,$encryptedDataKey.Length)

    # splat the common KMS parameters
    $splat = @{CiphertextBlob=$encryptedMemoryStreamToDecrypt; Region=$Region}
    if(![string]::IsNullOrEmpty($AccessKey)){$splat += @{AccessKey=$AccessKey}}
    if(![string]::IsNullOrEmpty($SecretKey)){$splat += @{SecretKey=$SecretKey}}

    # decrypt the Data key with KMS
    try {
        $decryptedMemoryStream = Invoke-KMSDecrypt @splat
    }
    catch {
        throw "Failed to decrypt KMS key"
    }
    $DataKey = $decryptedMemoryStream.Plaintext.ToArray()

    # memory stream to receive the decrypted file content
    $memoryStream = New-Object System.IO.MemoryStream

    # Symmetric (AES) decryption of the file using the decrypted Data key
    $cryptor = New-Object -TypeName System.Security.Cryptography.AesManaged
    $cryptor.Mode = [System.Security.Cryptography.CipherMode]::CBC
    $cryptor.Padding = [System.Security.Cryptography.PaddingMode]::PKCS7
    $cryptor.KeySize = 128
    $cryptor.BlockSize = 128

    $cs = New-Object System.Security.Cryptography.CryptoStream($memoryStream, $cryptor.CreateDecryptor($DataKey,$iv), [System.Security.Cryptography.CryptoStreamMode]::Write)
    $cs.Write($encryptedContent,0,$encryptedContent.Length)
    $cs.FlushFinalBlock()
    $decryptedArray = $memoryStream.ToArray()

    if ([string]::IsNullOrEmpty($outPath))
    {
        $finalOutPath = $filePath + '.dec'
    }
    else
    {
        $finalOutPath = $outPath
    }

    # Write bytes to file
    [System.IO.File]::WriteAllBytes($finalOutPath,$decryptedArray)

    return $finalOutPath
}

Below is some sample code that makes use of the functions; simply fill in the locations of the files, the encryption strength (AES_128 or AES_256), the access/secret keys, the KMS Master key you want to use for encryption and the region where the key is stored.

Import-Module awspowershell
# set your credentials to access AWS, key you want to encrypt with, and the region the key is stored
$AccessKey = ''
$SecretKey = ''
$Region = 'eu-west-1'
$keyID = '' # GUID
$KeySpec = 'AES_128' # AES_128 or AES_256
# Encrypt File
$encryptedFile = Invoke-KMSEncryptFile -filePath "C:\temp\testfile.txt" -keyID $keyID -KeySpec $KeySpec -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
Write-Host $encryptedFile
# Decrypt File
$decryptedFile = Invoke-KMSDecryptFile -filePath "C:\temp\testfile.txt.enc" -outPath "C:\temp\testfile2.txt" -KeySpec $KeySpec -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
Write-Host $decryptedFile

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Encrypting Secrets using PowerShell and AWS KMS

AWS Key Management Service (KMS) is an Amazon managed service that makes it easy for you to create and control encryption keys that you can then use to encrypt data.  A lot of AWS services natively integrate with KMS, e.g. S3, but I wanted to use a KMS key to encrypt a secret (e.g. a password) that I could store inside a configuration file and decrypt when required.

To do this I created two PowerShell functions, one for encryption and one for decryption, that I can embed in scripts.  The encryption function will securely transfer your plaintext to KMS, KMS will encrypt the data and return an encrypted memory stream, which I convert to a base64 string to make it easy to store in text/XML/JSON.

The decrypt function takes a previously encrypted base64 string, converts and sends it to KMS to decrypt (note you don’t have to tell KMS which key is required to decrypt) and KMS returns a plaintext memory stream which I convert back to a UTF8 encoded string.

I generally use these functions during userdata execution (boot time) on an AWS EC2 instance to decrypt secrets that I need to configure the instance and/or applications, but you could use this on any Windows machine.  To support the use of IAM Roles on EC2 instances, I have made the access/secret key parameters optional, i.e. if you don't pass an access/secret key the function will attempt to use the privileges provided by the IAM role applied to the EC2 instance, assuming you are running the function on EC2.
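For illustration, here is a minimal sketch of how an encrypted value might be stored in and read back from a JSON config file using the two functions defined below; the file path, property name, plaintext and key ID are purely illustrative:

# at build/deploy time: encrypt the secret and store the base64 cipher text in a config file
$cipherText = Invoke-KMSEncryptText -plainText 'MySecretPassword' -keyID '<KMS key GUID>' -Region 'eu-west-1'
@{ dbPassword = $cipherText } | ConvertTo-Json | Set-Content -Path 'C:\config\app.json'

# at boot time (e.g. in EC2 userdata): read the config and decrypt, relying on the instance's IAM role
$config = Get-Content -Path 'C:\config\app.json' -Raw | ConvertFrom-Json
$dbPassword = Invoke-KMSDecryptText -cipherText $config.dbPassword -Region 'eu-west-1'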

Function to Encrypt

function Invoke-KMSEncryptText
(
	[Parameter(Mandatory=$true,Position=1,HelpMessage='PlainText to Encrypt')]
	[string]$plainText,
	[Parameter(Mandatory=$true,Position=2,HelpMessage='GUID of Encryption Key in KMS')]
	[string]$keyID,
	[Parameter(Mandatory=$true,Position=3)]
	[string]$region,
	[Parameter(Position=4)]
	[string]$AccessKey,
	[Parameter(Position=5)]
	[string]$SecretKey
)
{
	# memory stream
	[byte[]]$byteArray = [System.Text.Encoding]::UTF8.GetBytes($plainText)
	$memoryStream = New-Object System.IO.MemoryStream($byteArray,0,$byteArray.Length)
	# splat
	$splat = @{Plaintext=$memoryStream; KeyId=$keyID; Region=$Region;}
	if(![string]::IsNullOrEmpty($AccessKey)){$splat += @{AccessKey=$AccessKey;}}
	if(![string]::IsNullOrEmpty($SecretKey)){$splat += @{SecretKey=$SecretKey;}}
	# encrypt
	$encryptedMemoryStream = Invoke-KMSEncrypt @splat
	$base64encrypted = [System.Convert]::ToBase64String($encryptedMemoryStream.CiphertextBlob.ToArray())
	return $base64encrypted
}

Function to Decrypt

function Invoke-KMSDecryptText
(
	[Parameter(Mandatory=$true,Position=1,HelpMessage='CipherText base64 string to decrypt')]
	[string]$cipherText,
	[Parameter(Mandatory=$true,Position=2)]
	[string]$region,
	[Parameter(Position=3)]
	[string]$AccessKey,
	[Parameter(Position=4)]
	[string]$SecretKey
)
{
	# memory stream
	$encryptedBytes = [System.Convert]::FromBase64String($cipherText)
	$encryptedMemoryStreamToDecrypt = New-Object System.IO.MemoryStream($encryptedBytes,0,$encryptedBytes.Length)
	# splat
	$splat = @{CiphertextBlob=$encryptedMemoryStreamToDecrypt; Region=$Region;}
	if(![string]::IsNullOrEmpty($AccessKey)){$splat += @{AccessKey=$AccessKey;}}
	if(![string]::IsNullOrEmpty($SecretKey)){$splat += @{SecretKey=$SecretKey;}}
	# decrypt
	$decryptedMemoryStream = Invoke-KMSDecrypt @splat
	$plainText = [System.Text.Encoding]::UTF8.GetString($decryptedMemoryStream.Plaintext.ToArray())
	return $plainText
}

Below is some sample code that makes use of the functions; simply fill in the access/secret keys, the KMS Master key you want to use for encryption and the region where the key is stored.  Obviously you should consider handling your plaintext more securely than I am here, but this serves as a simple test.

Import-Module awspowershell
# set your credentials to access AWS, key you want to encrypt with, and the region the key is stored
$AccessKey = ''
$SecretKey = ''
$Region = 'eu-west-1'
$keyID = ''
$plainText = 'Secret'

# Encrypt some plain text and write to host
$cipherText = Invoke-KMSEncryptText -plainText $plainText -keyID $keyID -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
Write-host $cipherText

# Decrypt the cipher text and write to host
$plainText = Invoke-KMSDecryptText -cipherText $cipherText -Region $Region -AccessKey $AccessKey -SecretKey $SecretKey
Write-host $plainText

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

PowerShell and Twilio: SMS

Twilio is a cloud-based messaging service; it can do everything from sending SMS messages to being the basis of an entirely cloud-based virtual call centre.  It's pretty powerful stuff.

My requirements are fairly basic though: I just want to be able to send SMS messages from a PowerShell script to engineers to alert them when things go wrong.  The first thing to do is head over to Twilio and set yourself up a test account, which is free and will allow you to send messages to yourself.

All interaction with Twilio is via their REST API, and the first method of interacting with it is to use the official Twilio C# DLLs; the instructions to download them are here. Once you have the DLLs, pop them in the same directory as the script you are going to be running. Here's a function to make use of them, along with a sample call:

function invoke-twilioSMS
(
    [Parameter(Mandatory=$true)][String]$AccountSid,
    [Parameter(Mandatory=$true)][String]$message,
    [Parameter(Mandatory=$true)][String]$fromTel,
    [Parameter(Mandatory=$true)][String]$toTel,
    [Parameter(Mandatory=$true)][String]$authToken,
    [Parameter(Mandatory=$true)][String]$dllPath
)
{
    # load the Twilio client libraries
    Add-Type -path "$dllPath\RestSharp.dll"
    Add-Type -path "$dllPath\Twilio.Api.dll"

    $twilio = new-object Twilio.TwilioRestClient($AccountSid,$authToken)
    $msg = $twilio.SendSmsMessage($fromTel, $toTel, $message)
}

invoke-twilioSMS -AccountSid "<AccountSid>" `
    -authToken "<authToken>" -message "<message>" `
    -fromTel "<fromTel>" -toTel "<toTel>" `
    -dllPath "<scriptPath>"

The problem with this method is that it's awkward to get hold of the DLLs, and I find there is something clunky about having to use DLLs to call a REST API.  So in method 2, I make use of Invoke-RestMethod (which arrived in PowerShell 3.0) to talk to the REST API directly.

function invoke-twilioRESTSMS
(
    [Parameter(Mandatory=$true)][String]$AccountSid,
    [Parameter(Mandatory=$true)][String]$message,
    [Parameter(Mandatory=$true)][String]$fromTel,
    [Parameter(Mandatory=$true)][String]$toTel,
    [Parameter(Mandatory=$true)][String]$authToken
)
{
    # Build a URI
    $URI = "https://api.twilio.com/2010-04-01/Accounts/$AccountSid/SMS/Messages.json"

    # encode authorization header
    $secureAuthToken = ConvertTo-SecureString $authToken -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($AccountSid,$secureAuthToken)

    # content
    $postData = "From=$fromTel&To=$toTel&Body=$message"

    # Fire Request
    $msg = Invoke-RestMethod -Uri $URI -Body $postData -Credential $credential -Method "POST" -ContentType "application/x-www-form-urlencoded"
}

invoke-twilioRESTSMS -AccountSid "<AccountSid>" `
    -authToken "<authToken>" -message "<message>" `
    -fromTel "<fromTel>" -toTel "<toTel>"

Method 2 is my preferred way of doing things for something as simple as sending an SMS.

There is still one further way to send a message using PowerShell and Twilio, and this addresses those who are using PowerShell 2.0 (so can't use Invoke-RestMethod) and don't want to use the DLLs: we can still build a request from scratch using the System.Net.WebRequest object:

function post-twilioSMS
(
    [Parameter(Mandatory=$true)][String]$AccountSid,
    [Parameter(Mandatory=$true)][String]$message,
    [Parameter(Mandatory=$true)][String]$fromTel,
    [Parameter(Mandatory=$true)][String]$toTel,
    [Parameter(Mandatory=$true)][String]$authToken
)
{
    # System.Web is needed for HttpUtility::UrlEncode
    Add-Type -AssemblyName System.Web

    # Build a URI
    $URI = "https://api.twilio.com/2010-04-01/Accounts/$AccountSid/SMS/Messages.json"
    $requestUri = new-object Uri ($URI)

    # Create the request and specify attributes of the request.
    $request = [System.Net.WebRequest]::Create($requestUri)

    # encode authorization header
    $authText = $AccountSid + ":" + $authToken
    $authUTF8 = [System.Text.Encoding]::UTF8.GetBytes($authText)
    $auth64 = [System.Convert]::ToBase64String($authUTF8)

    # Define the required headers
    $request.Method = "POST"
    $request.Headers.Add("Authorization: Basic $auth64")
    $request.Accept = "application/json, application/xml, text/json, text/x-json, text/javascript, text/xml"
    $request.ContentType = "application/x-www-form-urlencoded"

    # content
    $fromTel = [System.Web.HttpUtility]::UrlEncode($fromTel)
    $toTel = [System.Web.HttpUtility]::UrlEncode($toTel)
    $message = [System.Web.HttpUtility]::UrlEncode($message)
    $postData = "From=$fromTel&To=$toTel&Body=$message"
    $request.ContentLength = $postData.Length

    # Stream Bytes
    $postBytes = [System.Text.Encoding]::ascii.GetBytes($postData)
    $requestStream = $request.GetRequestStream()
    $requestStream.Write($postBytes, 0, $postBytes.Length)
    $requestStream.Flush()
    $requestStream.Close()

    # Fire Request
    $response = $request.GetResponse()

    # Output Response
    $responseStream = $response.GetResponseStream()
    $responseReader = New-Object System.IO.StreamReader $responseStream
    $returnedResponse = $responseReader.ReadToEnd()
    $response.Close()
}

post-twilioSMS -AccountSid "<AccountSid>" `
    -authToken "<authToken>" -message "<message>" `
    -fromTel "<fromTel>" -toTel "<toTel>"

Method 3 works just fine; it lacks the simplicity of Method 2, but it does give you very granular control over what the web request is doing, which I have found very useful when working with other REST APIs that aren't quite as well behaved as Twilio.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Filtered Azure Blob to Blob Copy

I recently had the job of copying tens of thousands of IIS log files, each one at least 100MB, from one Azure Storage account to another.  Using something simple like CloudBerry to copy the files just wasn't going to cut it, as it copies each file first to the local client and then pushes it back into Azure; not efficient at all.

A quick bit of digging and I discovered that the Azure PowerShell cmdlet Start-AzureStorageBlobCopy allows you to trigger a copy from Azure to Azure, which runs very quickly; it will even allow you to copy an entire container from one storage account to another.  What it won't allow you to do is pass in a filter so that only blobs matching the filter are copied.

So here’s a function that I wrote to get that functionality, with some progress bars and timers for added effect 🙂

Function Start-AzureStorageBlobContainerCopy
(
    [Parameter(Mandatory=$true)][String]$srcStorageAccountName,
    [Parameter(Mandatory=$true)][String]$destStorageAccountName,
    [Parameter(Mandatory=$true)][String]$SrcStorageAccountKey,
    [Parameter(Mandatory=$true)][String]$DestStorageAccountKey,
    [Parameter(Mandatory=$true)][String]$SrcContainer,
    [Parameter(Mandatory=$true)][String]$DestContainer,
    [String]$filter = ""
)
{
    Import-Module Azure

    $srcContext = New-AzureStorageContext -StorageAccountName $srcStorageAccountName -StorageAccountKey $SrcStorageAccountKey
    $destContext = New-AzureStorageContext -StorageAccountName $destStorageAccountName -StorageAccountKey $DestStorageAccountKey

    # index the source container, applying the filter if one was supplied
    $timeTaken = measure-command{
        if ($filter -ne "")
        {
            $blobs = Get-AzureStorageBlob -Container $SrcContainer -Context $srcContext | ? {$_.name -match $filter}
        }
        else
        {
            $blobs = Get-AzureStorageBlob -Container $SrcContainer -Context $srcContext
        }
    }
    Write-host "Total Time to index $timeTaken" -BackgroundColor Black -ForegroundColor Green

    # trigger a server-side copy for each blob
    $i = 0
    $timeTaken = measure-command{
        foreach ($blob in $blobs)
        {
            $i++
            $percentComplete = [int](($i / $blobs.Count) * 100)
            Write-Progress -Activity:"Copying..." -Status:"Copied $i of $($blobs.Count) : $($percentComplete)%" -PercentComplete:$percentComplete
            $copyInfo = Start-AzureStorageBlobCopy -ICloudBlob $blob.ICloudBlob -Context $srcContext -DestContainer $DestContainer -DestContext $destContext -Force
            Write-host (get-date) $copyInfo.name
        }
    }
    write-host
    Write-host "Total Time $timeTaken" -BackgroundColor Black -ForegroundColor Green
}

Start-AzureStorageBlobContainerCopy -srcStorageAccountName "<src Storage>" -SrcStorageAccountKey "<src key>" -SrcContainer "<src Container>" `
    -destStorageAccountName "<dest Storage>" -DestStorageAccountKey "<dest key>" -DestContainer "<dest Container>" `
    -filter "<filter>"

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Switching garbage collection on in an Azure Worker role

Whilst working on an issue with Microsoft on one of our production environments, we came across the fact that an Azure Worker role, by default, has its garbage collection set to workstation and not server mode.  If you are using medium or larger (hence multi-processor) instances, you could see a performance benefit by switching to server mode.

Unfortunately the Azure tooling does not currently allow you to directly configure this setting, so you have to do it in a roundabout fashion, by creating a startup task that makes the change as the instance boots.

First, define a startup task in the Service Definition of your worker role:

<WorkerRole name="WorkerRole1" vmsize="Medium">
  <Startup>
    <Task commandLine="startup.cmd" executionContext="elevated" taskType="simple" />
  </Startup>

Now create a "startup.cmd" in the root of your worker role that will be used to kick off the PowerShell that will modify the config file.

@echo off
powershell -command "Set-ExecutionPolicy RemoteSigned"
powershell .\setServerGC.ps1 2>> err.out

And finally create the "setServerGC.ps1" file in the root of your worker role; this is the file that will actually make the modifications.

# Load up the XML
$configFile = "${env:RoleRoot}\base\x64\WaWorkerHost.exe.config"
[xml]$waXML = Get-Content $configFile

if (($waXML.configuration.runtime.gcServer -eq $null) -and ($waXML.configuration.runtime.gcConcurrent -eq $null))
{
    # Modify XML
    $gcServerEl = $waXML.CreateElement('gcServer')
    $gcConcurrentrEl = $waXML.CreateElement('gcConcurrent')

    $gcServerAtt = $waXML.CreateAttribute("enabled")
    $gcServerAtt.Value = "true"
    $gcConcurrentrAtt = $waXML.CreateAttribute("enabled")
    $gcConcurrentrAtt.Value = "true"

    $gcServerEl.Attributes.Append($gcServerAtt) | Out-Null
    $gcConcurrentrEl.Attributes.Append($gcConcurrentrAtt) | Out-Null

    $waXML.configuration.runtime.appendChild($gcServerEl) | Out-Null
    $waXML.configuration.runtime.appendChild($gcConcurrentrEl) | Out-Null

    $waXML.Save($configFile)

    # Restart WaWorkerHost.Exe
    Get-Process | ? {$_.name -match "WaHostBootstrapper"} | Stop-Process -Force
    Get-Process | ? {$_.name -match "WaWorkerHost"} | Stop-Process -Force
}

We saw a significant performance boost on the role we deployed this on, but your mileage will vary depending on your workload.
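If you want to confirm the change took effect, a quick sanity check (a minimal sketch, reusing the config path from the script above) is to read the modified config back; from within the role's own code, System.Runtime.GCSettings.IsServerGC will also report whether server GC is actually active:

# read back the worker host config and show the gcServer setting that the startup task wrote
$configFile = "${env:RoleRoot}\base\x64\WaWorkerHost.exe.config"
([xml](Get-Content $configFile)).configuration.runtime.gcServer.enabled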

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

Simple Azure Storage Queue Monitor

If you need to monitor the length of a queue in Azure, you can use the Azure PowerShell CmdLets to help you out.

Below is a sample ticker script that uses the Azure cmdlets (so make sure you have them installed); it polls the configured queue every 10 seconds.

clear
Import-Module Azure

$cert = Get-Item cert:\currentuser\my\<cert thumbprint> # management cert
$subID = "<subscription ID>"          # Subscription ID
$storageAccount = "<storage account>" # storage account where queue lives
$queueName = "<queueName>"            # Queue you're interested in
$interval = 10                        # Time between ticks

Set-AzureSubscription -SubscriptionID $subID -Certificate $cert `
    -SubscriptionName "CurrentSubscription" `
    -CurrentStorageAccount $storageAccount
Select-AzureSubscription -SubscriptionName "CurrentSubscription"

# do forever loop
do
{
    # measure how long it takes to run the command
    $timeTaken = Measure-Command{
        # get the queue info
        $queueInfo1 = Get-AzureStorageQueue -Name $queueName
        # write it to screen
        Write-Host (Get-Date) $queueInfo1.ApproximateMessageCount
    }
    # take the time taken to run the command off the interval time
    $totalTimeToWait = New-TimeSpan -Seconds $interval
    $timeToWait = $totalTimeToWait - $timeTaken
    # go to sleep
    sleep ($timeToWait.TotalSeconds)
} while($true)

Parse IIS log files with PowerShell

I recently got asked if there was an easy way to find out the average time-taken for an IIS instance to complete a request.  This information is available in the IIS log files; you just need to parse it out.  Now there are many IIS log parsers available on the internet, but I thought "I wonder how easily I could do that in PowerShell"; it turns out very easily!

The first thing to do is define the path to your log file:

$IISLogPath = "C:\Temp\sample.log"

Next we need to find the headers that are available in this particular log file.  First load the file, pick the headers out (always on the 4th line) using 'split' to separate the headers, which are delimited by white space, and then get rid of the "#Fields:" prefix from the headers.

Note I've used [System.IO.File]::ReadAllLines to load the file as it's a lot faster than Get-Content; this makes a big difference if you're iterating through a lot of files!

$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}

Now we need to actually import the file, which is nice and simple: as we've already got the headers we can just use Import-Csv to do the work for us, and then do a little bit of clean-up, removing any comment lines which start with a #.

$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}

Finally, let's collect all the time-taken values into an array.  Note I had to do a little bit of hoop jumping to stop the "-" being interpreted by PowerShell.

$timeTaken = $IISLogFileCSV | foreach {$_.$("time-taken")}

So putting it all together and we get this:

$IISLogPath = "C:\Temp\sample.log"

$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}

$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}

$timeTaken = $IISLogFileCSV | foreach {$_.$("time-taken")}

Once you've got the array of time-taken values, you can ask questions like "what was the average and max time-taken?", but I'll leave that bit up to you!
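For example, a minimal follow-on (the time-taken values come back from Import-Csv as strings, so cast them to integers before doing the maths):

# cast the string values to integers and let Measure-Object work out the stats
[int[]]$timeTakenMs = $timeTaken
$timeTakenMs | Measure-Object -Average -Maximum -Minimum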

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

DAC SQL Azure Import Export Service PowerShell Client Module

SQL Azure offers a hosted service to import/export databases between SQL Azure and Azure Blob storage; essentially they have put up a REST API and you can fire commands at it.  There is even a Codeplex project with SQL DAC example client implementations.

When I recently attempted to automate exports of a number of databases we host in Azure, I grabbed a copy of the client, wrapped it in PowerShell and thought: job done. That's where I ran into issue number one: the exe randomly hangs for me.

C# is not one of my strong points, so attempting to debug the C# source probably wasn't a good idea; instead I decided to re-implement the client in PowerShell.

So, after a lot of coffee, detective work (the REST API isn't very well documented currently) and digging around in the example source code, I've put together a PowerShell module implementing the three main features: Export, Import and Status.

Usage

I’ve attempted to keep the command switches as close as I could to the Codeplex project so if you’re switching from one to the other, you should be able to figure out what’s going on very quickly. 

start-DacExport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key>
start-DacImport -s <server> -d <database> -u <username> -p <password> -bloburl <bloburl> -blobaccesskey <key> -size <inGB> -edition <web/business>
get-DacJobStatus -s <server> -u <username> -p <password> [-requestid <GUID> -withInfo]

Both start-DacExport and start-DacImport will return the GUID of the job, which you can then use with get-DacJobStatus.  get-DacJobStatus will return an XML object containing the job's status information; this is great if you are using the function in your own script, but if you just want to print the results to screen make sure you use -withInfo and the XML will be sent to the console instead.
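As an illustration, here is a minimal polling sketch built on these functions; the placeholders match the usage section above, and the property path into the returned XML is assumed from the sample output further down:

# start an export and capture the request ID it returns
$requestId = start-DacExport -s '<server>' -d '<database>' -u '<username>' -p '<password>' `
    -bloburl '<bloburl>' -blobaccesskey '<key>'

# poll the job every 30 seconds until it reports a terminal status
do
{
    Start-Sleep -Seconds 30
    $statusXml = get-DacJobStatus -s '<server>' -u '<username>' -p '<password>' -requestid $requestId
    # property path assumed from the ArrayOfStatusInfo sample output below
    $status = $statusXml.ArrayOfStatusInfo.StatusInfo.Status
    Write-Host (Get-Date) $status
} while ($status -notmatch 'Completed|Failed')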

Installation Instructions

  • Create a new folder called DacIESvcPS in your modules directory, e.g. C:\Windows\System32\WindowsPowerShell\v1.0\Modules\DacIESvcPS
  • Download the latest version of the PSM1 file from https://github.com/stevenaskwith/DacIESvcPS into the new directory
  • Launch a PowerShell console and run
    • import-module DacIESvcPS
  • To confirm the module loaded correctly, run
    • Get-Command -Module DacIESvcPS

You should get something like this:

(screenshot of the Get-Command output omitted)

An example output of get-DacJobStatus -withInfo would look like this:

db3prod-dacsvc.azure.com
<?xml version="1.0" encoding="ibm850"?>
<ArrayOfStatusInfo xmlns="http://schemas.datacontract.org/2004/07/Microsoft.SqlServer.Management.Dac.ServiceTypes" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
  <StatusInfo>
    <BlobUri>http://myExportBlob.blob.core.windows.net/sqlexports/someDatabase.bacpac</BlobUri>
    <DatabaseName>someDatabase</DatabaseName>
    <ErrorMessage />
    <LastModifiedTime>2012-03-22T10:18:57.1864719Z</LastModifiedTime>
    <QueuedTime>2012-03-22T10:16:03.7488387Z</QueuedTime>
    <RequestId>2bbbf314-3ec5-4f7c-afbd-ba219a61954b</RequestId>
    <RequestType>Import</RequestType>
    <ServerName>eccaps1fj1.database.windows.net</ServerName>
    <Status>Completed</Status>
  </StatusInfo>
</ArrayOfStatusInfo>

I would love to hear back from anyone who uses this in the field.

THIS POSTING AND CODE RELATED TO IT ARE PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK

What Focal Lengths do I take photos at? Part 2 – Charting

So in my previous post I showed how you can leverage PowerShell and the .NET Framework to analyse your photo collection and get some meaningful statistics from it; in my example, figuring out the frequency distribution of the focal lengths at which you take your photographs.

The results are presented in a simple hash table, which does the job but doesn't make it very easy to visualise the results; presenting the data in a chart would make the output a lot easier to understand.  You could just copy and paste the output into Excel and create a chart from the data, but I thought there must be a way to do this within PowerShell; so after a quick dig and a very helpful blog post, this is what I came up with:

Changes

Once your script grows over 100 lines it starts to get really hard to keep track of what's going on, so whilst functions are great at wrapping up bits of repeatable and reusable code, I find they are also good for breaking your script down into easy-to-manage sections.  Finally, I coordinate the script via control logic in the 'main' at the end of the script, which is around 15 lines of code including comments, but very readable!

The two functions (and parameters) I have added are:

  • get-FocalLengths <files>
    • returns the frequency distribution in a hashtable
  • createChart <hashtable of values> <title> <x-axis title> <y-axis title>
    • Takes a hashtable (which doesn’t have to be focal lengths) and creates a chart, prints it to screen and writes it to a file

Part of the createChart function requires that you load some additional .NET assemblies; these are not part of the standard .NET 3.5 SP1 package but are a Microsoft add-on, so if you don't have them installed on your machine when you run the code we need to handle that error.

When the script loads the assembly, the output of that operation is captured in the $assembly variable; if $assembly is null then nothing was loaded, i.e. you don't have the "System.Windows.Forms.DataVisualization" assembly on your PC.  In this case my error handling code launches Internet Explorer at the download page for the assembly and exits the script.

$assembly = [Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms.DataVisualization")

if($assembly -eq $null)
{
    Write-Host "Charting module not installed, please install it"
    # launch IE and navigate to the correct page
    $ie = New-Object -ComObject InternetExplorer.Application
    $ie.Navigate("http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=14422")
    $ie.Visible = $true
    break
}
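For reference, here is a minimal sketch of what a createChart-style function could look like once the assembly is loaded; the real implementation lives in the GitHub source linked below, so the chart sizes and default output path here are purely illustrative:

function createChart
(
    [hashtable]$data,
    [string]$title,
    [string]$xTitle,
    [string]$yTitle,
    [string]$outFile = "$pwd\chart.png" # illustrative default output path
)
{
    # assumes the DataVisualization assembly was loaded as shown above
    Add-Type -AssemblyName System.Windows.Forms

    # build the chart object and a chart area to draw in
    $chart = New-Object System.Windows.Forms.DataVisualization.Charting.Chart
    $chart.Width = 800
    $chart.Height = 500
    [void]$chart.Titles.Add($title)

    $chartArea = New-Object System.Windows.Forms.DataVisualization.Charting.ChartArea
    $chartArea.AxisX.Title = $xTitle
    $chartArea.AxisY.Title = $yTitle
    $chart.ChartAreas.Add($chartArea)

    # bind the hashtable keys/values to a column series
    [void]$chart.Series.Add("Data")
    $chart.Series["Data"].ChartType = [System.Windows.Forms.DataVisualization.Charting.SeriesChartType]::Column
    $chart.Series["Data"].Points.DataBindXY($data.Keys, $data.Values)

    # write the chart to a file and display it in a simple form
    $chart.SaveImage($outFile, "png")

    $form = New-Object System.Windows.Forms.Form
    $form.Width = $chart.Width
    $form.Height = $chart.Height
    $chart.Dock = [System.Windows.Forms.DockStyle]::Fill
    $form.Controls.Add($chart)
    [void]$form.ShowDialog()
}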

Source

I've moved the source code for this project into GitHub and you can download it here:

https://github.com/stevenaskwith/Focal-Lengths/blob/master/get-FocalLengths.ps1; click the "RAW" button and use Save As to grab a copy of the ps1.

Usage

Call the script in the same way as the previous version:

.\get-FocalLengths -Path "c:\images" -fileType ".png" -model "Canon EOS 7D"

or drop the parameters you don't need and let the script use its defaults, e.g.

.\get-FocalLengths -Path "c:\images"

Summary

Firing the script against my personal photo collection resulted in the nice graph shown below.  It's quite obvious to see now that I either take my photos zoomed all the way in or all the way out.  Now I've just got to pick where to spend my money…

THIS POSTING IS PROVIDED “AS IS” AND INFERS NO WARRANTIES OR RIGHTS, USE AT YOUR OWN RISK