After updating a Docker container running on AWS ECS / Fargate, I encountered some erroneous characters in the CloudWatch Logs output of the ECS task. When I originally created the Dockerfile, the latest version of PowerShell available from the Microsoft repo was version 7; the updated container was running PowerShell 7.2. The amazonlinux base image was also updated to the latest version, so one of these recent updates within the container had added leading and trailing characters to the logging output.
Google came up totally empty for me with this one so I am posting this to help out anyone who ends up in the same boat.
The characters showing up in the CloudWatch logs looked like this for the verbose output stream:
[33;1m VERBOSE: Output [0m
And the characters in the standard output stream looked like this:
[32;1m Regular StdOut Output [0m
Adding one line of code to the top of the script removed all the erroneous characters from the log output.
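The stray sequences are ANSI color codes: PowerShell 7.2 introduced the $PSStyle automatic variable and colors stream output by default, which a non-interactive log collector captures literally. A line along these lines disables that rendering (this uses the documented $PSStyle API; whether it matches the original script's exact line is my assumption):

PowerShell
# Disable ANSI styling so redirected/captured output is plain text
$PSStyle.OutputRendering = [System.Management.Automation.OutputRendering]::PlainText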
The other day I needed to launch some one-off Fargate / ECS container tasks with PowerShell. The documentation covered most of what I needed, but I could not find any examples of how to override environment variables sent to the container task.
I only needed to change one environment variable, so creating a whole new task definition seemed like overkill. After some trial and error, the code below got the job done.
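Here is a sketch of the approach, with placeholder names for the cluster, task definition, container, and subnet; the flattened parameter names follow the AWS Tools for PowerShell conventions, so verify them against your module version:

PowerShell
# All names below are placeholders for illustration
$envOverride = New-Object Amazon.ECS.Model.KeyValuePair
$envOverride.Name  = 'MY_SETTING'
$envOverride.Value = 'one-off-value'

$containerOverride = New-Object Amazon.ECS.Model.ContainerOverride
$containerOverride.Name = 'my-container'   # must match the container name in the task definition
$containerOverride.Environment = New-Object 'System.Collections.Generic.List[Amazon.ECS.Model.KeyValuePair]'
$containerOverride.Environment.Add($envOverride)

New-ECSTask -Cluster 'my-cluster' `
    -TaskDefinition 'my-task-def:1' `
    -LaunchType FARGATE `
    -Overrides_ContainerOverride $containerOverride `
    -NetworkConfiguration_AwsvpcConfiguration_Subnet 'subnet-0123456789abcdef0' `
    -NetworkConfiguration_AwsvpcConfiguration_AssignPublicIp ENABLED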
If this helped you out please let me know in the comments. Feedback will motivate me to share solutions on this site more often.
One of the things that tripped me up early on while learning PowerShell was working with objects. Like most sysadmins I approached learning PowerShell from a scripting mindset. I wanted to run a script and have the script complete a routine task. I thought about PowerShell as a purely procedural language and I mostly ignored objects.
A great characteristic of PowerShell is just how easy it is to get started. You can get tons of tasks done in PowerShell without a solid grip on objects. But to make progress in the language and reach what most consider intermediate-level knowledge, you need a solid understanding of objects. How does someone without a programming or developer background get there? There are so many kinds of objects; it's not as if you pass objects over the pipeline some set number of times and then have them all mastered. No, the best way to get comfortable working with objects is to learn how to examine them.
Below are the methods I use the most when working with unfamiliar objects.
Method Number 1: IntelliSense in the PowerShell ISE.
This is my go-to method of inspecting a simple object. It is often all that is needed to discover the properties of the object I am after.
To demonstrate this I will use a practical example: getting the full path to file objects returned from filtering the results of Get-ChildItem. I can save the search results to a variable $a.
With the variable saved, I can type the variable name followed by a trailing period, and that will start IntelliSense exploration of the object.
This gets me right to the “FullName” property that contains the complete path.
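For example (the filter pattern here is made up for illustration):

PowerShell
# Save filtered results, then explore the object with IntelliSense
$a = Get-ChildItem -Recurse -Filter '*.log'
$a.FullName    # typing "$a." in the ISE pops up FullName among the suggestions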
Method Number 2: Get-Member
If using IntelliSense does not get me what I am after, I am probably looking at a more complex object, maybe an object that contains other objects. The best way I've found to inspect an object beyond IntelliSense is to pipe it to Get-Member or its alias “gm”.
Again I will use a practical example from one of my posts a few weeks back about working with Amazon EC2 security groups. My goal in this example was to create security groups and their inbound/ingress rules. I know from experience that I want to understand Amazon's objects when working with the AWS Tools for Windows PowerShell before using the cmdlets. So I set out to look at what kind of objects Amazon uses for its EC2 security groups. My first step was to look up a security group with the Get-EC2SecurityGroup cmdlet and save the returned object to a variable.
PowerShell
$sg = Get-EC2SecurityGroup -GroupId sg-66ad9e19
With the object stored in the variable, I tried IntelliSense and saw an IpPermission property that looked promising. I had a hunch this property would reveal how security groups handle their network traffic permissions.
After choosing the property and entering it into the console, I can see it does contain what I am after, but it's not so straightforward. I see some “{}” in fields where I expect data. Ports 80 and 22 match up with my ingress rules on the security group, but there are no details on the source security group of the ingress rule.
This is a sign of a more complex object, and it's time to use Get-Member.
PowerShell
$sg.IpPermission | gm
Whoa, this object is quite complex. The IpPermission property I am inspecting is an object type unto itself: an Amazon.EC2.Model.IpPermission, listed at the top of the Get-Member output. This IpPermission has its own set of properties. I think of these as “sub-properties” of the parent security group object. Looking at these “sub-properties”, we see they are lists of other object types. It's only going to get more complex from here!
Next I backtrack a step and pipe $sg to Get-Member and see that it is an Amazon.EC2.Model.SecurityGroup.
PowerShell
$sg | gm
With the type of the object gained from Get-Member, I can use a search engine to find my way to Amazon's documentation of this object's class, which is another great source of information. From this point, further inspection is a matter of preference: I can keep using Get-Member on all the properties under IpPermission, or I can look at Amazon's documentation for the IpPermission class. Both options are valid, but I prefer to keep using PowerShell. Continuing down this path of discovering sub-properties and piping them to Get-Member might take a while, so to save time I can move on to my final and new favorite method of object exploration.
Method Number 3: Show-Object
Show-Object is a great add-on to PowerShell. It's available from the PowerShell Gallery as part of Lee Holmes' PowerShellCookbook module. It's like Get-Member on steroids.
PowerShell
$sg | Show-Object
When you pipe an object to Show-Object, it displays a tree view of the object in a GUI window, similar to what Get-Help's -ShowWindow parameter opens. You can use the popup window to click through all the properties of the object and discover more details about its inheritance and structure. As you drill into the tree view, the bottom pane of the window updates with familiar Get-Member results for each property.
A few clicks later inside the IpPermission property, I see information about UserIdGroupPair, and I've found my source security group allowed for ingress traffic: “sg-fda89b92” in the image above. It is in a form I did not initially expect. With all the information gained from these discovery methods, it was only a matter of time before I had a great understanding of this previously unknown object type.
I wanted to look at connecting two disparate systems for a recent project. The goal was to be able to enter information into one system and have it processed by another system. The systems have no direct authentication trusts between them, but they are both running on Amazon Web Services' EC2 platform. This was a perfect use for the decoupling nature of the Amazon Simple Queue Service (SQS), and I wanted to come up with a proof of concept, which is outlined below.
Before getting into any details, I want to make clear that this is not a best-practice use of SQS. For most uses of SQS there is a need to track processed messages in some kind of permanent state, such as a database. With a persistent data store recording which messages have been processed, queue workers can correctly handle messages that SQS delivers more than once (standard queues only guarantee at-least-once delivery). That being said, let's go over this proof of concept.
Assuming AWS keys with the correct permissions are configured and the AWSPowerShell module is loaded, the command below will create a new SQS queue with PowerShell. The command returns the created queue URL, which is stored in the variable $NewSQSQueueUrl for future use.
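The queue creation can look like this (the queue name is a placeholder):

PowerShell
# New-SQSQueue returns the URL of the queue it creates
$NewSQSQueueUrl = New-SQSQueue -QueueName 'POCQueue'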
A quick peek at the SQS console to ensure the queue was created.
This next bit creates an array of strings which will serve as some example information to share between the systems. For this proof of concept I am sending example PowerShell parameters into the SQS queue.
PowerShell
$exampleparams = @("All The Things","Another Example Parameter")
I have written POC functions, also uploaded to my GitHub PowerShell repo, which get dot-sourced. These functions put the information (the example parameters) into the new SQS queue as message attributes on newly created SQS messages.
After running these functions, the message IDs are returned to the PowerShell host, indicating the messages were inserted into the SQS queue successfully.
Below is the dot-sourced function that did the uploading. You could customize this to fit your use case with some help from the AWS Send-SQSMessage cmdlet documentation.
PowerShell
function Send-POCQueueMessage {
    # Wrapper sketch; only the Send-SQSMessage call survives from the original listing
    param (
        [String]$SQSQueueUrl,
        [Hashtable]$messageAttributes
    )
    try {
        Send-SQSMessage -QueueUrl $SQSQueueUrl -MessageAttributes $messageAttributes `
            -MessageBody "Request generated by $($env:Username) at $(Get-Date -Format u)"
    }
    catch {
        Write-Error $_
    }
}
With messages being put into the queue, I need a function to pull down the messages and process them on the queue worker system (aka the SQS message receiver). My goal is to take different actions on the queue worker system based on the message attributes of the SQS messages pulled out of the queue. That function looks something like this.
Start-SQSQueueProcessing
PowerShell
function Start-SQSQueueProcessing
{
    [CmdletBinding()]
    [OutputType([int])]
    Param
    (
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [String]$SQSQueueUrl
    )
    Process
    {
        try
        {
            Write-Verbose "$(Get-Date -Format u) Polling for $SQSQueueUrl messages"
            # The middle of the original listing did not survive; this receive/process/delete
            # loop is a reconstruction of what the surrounding text describes
            $messages = Receive-SQSMessage -QueueUrl $SQSQueueUrl -MessageAttributeName All
            if ($messages)
            {
                foreach ($message in $messages)
                {
                    # Branch on $message.MessageAttributes here to take different actions,
                    # then delete the processed message from the queue
                    Remove-SQSMessage -QueueUrl $SQSQueueUrl -ReceiptHandle $message.ReceiptHandle -Force
                }
            }
            else { Write-Verbose "$(Get-Date -Format u) ... No messages were pulled from the queue...nothing to do right now" }
        }
        catch
        {
            Write-Error $_
        }
    }
}
This function isn't actually doing anything interesting with the messages other than generating some output to the PowerShell streams, but this is a proof of concept after all :).
Considerations when using SQS
As SQS is designed to decouple distributed systems, SQS does not assume every message pulled from the queue has been processed successfully. Messages that are pulled from the queue are hidden from the queue until the message visibility timeout period has passed. It is up to the queue workers to delete the messages from the queue after the message has been processed. This is why at the end of the function above, messages are deleted from the queue with the Remove-SQSMessage cmdlet.
After working with SQS a bit, I noticed that the behavior around delivery of messages sitting in the queue is a little unintuitive. For example, say there are 8 messages in a queue and I request up to 10 messages with Receive-SQSMessage. A logical assumption would be that all 8 messages are returned, but that is rarely the case. After working with a few queues, it becomes quite apparent that SQS returns a seemingly random subset of the available messages. Additionally, without using FIFO (First-In, First-Out) queues, the messages will often be delivered out of order.
Another gotcha I ran into at first was that, by default, Receive-SQSMessage will not return any message attributes from SQS. The resulting Amazon.SQS.Model.Message objects had blank MessageAttributeValues until I specified the “-MessageAttributeName All” parameter.
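In practice that means asking for the attributes explicitly on every receive (reusing the $NewSQSQueueUrl variable from earlier):

PowerShell
# Without -MessageAttributeName, the message attributes come back empty
$messages = Receive-SQSMessage -QueueUrl $NewSQSQueueUrl -MessageAttributeName All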
Hopefully the above considerations shed some light on the way the function is written. I wrote it so that it could be run repeatedly from a parent polling script and could handle one or more message objects being returned from each poll of SQS.
Back to the functions
Finally, we get to the polling function portion of the script, which could run on a scheduled interval via Task Scheduler. This function first checks a queue for the existence of messages using the Get-SQSQueueAttribute cmdlet. If messages are found in the queue, it invokes the Start-SQSQueueProcessing function referenced above to handle them. I make use of PowerShell transcription to keep a log for now. If this ever moves out of proof of concept, logging could be improved quite a bit.
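The message-count check can be sketched like this; ApproximateNumberOfMessages is the real SQS attribute name, while the rest is illustrative:

PowerShell
$attributes = Get-SQSQueueAttribute -QueueUrl $NewSQSQueueUrl -AttributeName ApproximateNumberOfMessages
if ([int]$attributes.ApproximateNumberOfMessages -gt 0) {
    Start-SQSQueueProcessing -SQSQueueUrl $NewSQSQueueUrl -Verbose
}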
What does it look like when run, you may be wondering?
The PowerShell transcript output captures the same information. As you may have noticed, the body of each generated SQS message contains information on who created the message and when, which could help with audit trails.
Here is some example code which may help you automate security group creation with PowerShell. I wanted to take a look at automating some security group creation tasks today, and there wasn't much help available via search engines. Maybe this post will help with that.
The minimum set of IAM permissions needed to accomplish this task is:
IAM Policy for this script
JSON
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1392679134000",
      "Effect": "Allow",
      "Action": [
        "ec2:Describe*",
        "ec2:AuthorizeSecurityGroupIngress",
        "ec2:CreateSecurityGroup",
        "ec2:CreateTags"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
This snippet of PowerShell (sketched below the list) will:
Look up the only VPC in your account, provided your regional defaults are set via Initialize-AWSDefaults or inherited from the EC2 instance you are running this on. This is helpful, as some of the PowerShell cmdlets only play nice with the default VPC, which many people tend to delete.
Create a new security group for a load balancer
Allow HTTP and HTTPS traffic ingress into the load balancer security group
Create a new security group for a web server
Allow HTTP from the load balancer to the web server security group
Allow SSH from a security group that is looked up by the name “My Bastion Host Security Group” to the web server
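Something along these lines accomplishes those steps; the group names are illustrative, and the hashtable-to-IpPermission pattern follows the examples in the Grant-EC2SecurityGroupIngress documentation:

PowerShell
# Assumes Initialize-AWSDefaults has set region/credentials; names are illustrative
$vpc = Get-EC2Vpc   # works when the account/region has exactly one VPC

# 1. Load balancer security group with HTTP/HTTPS open to the world
$elbSgId = New-EC2SecurityGroup -VpcId $vpc.VpcId -GroupName 'MyLoadBalancerSG' -GroupDescription 'ELB security group'
$http  = @{ IpProtocol = 'tcp'; FromPort = 80;  ToPort = 80;  IpRanges = '0.0.0.0/0' }
$https = @{ IpProtocol = 'tcp'; FromPort = 443; ToPort = 443; IpRanges = '0.0.0.0/0' }
Grant-EC2SecurityGroupIngress -GroupId $elbSgId -IpPermission @($http, $https)

# 2. Web server security group
$webSgId = New-EC2SecurityGroup -VpcId $vpc.VpcId -GroupName 'MyWebServerSG' -GroupDescription 'Web server security group'

# 3. HTTP from the load balancer SG; SSH from the bastion SG looked up by name
$bastionSg = Get-EC2SecurityGroup -Filter @{ Name = 'group-name'; Values = 'My Bastion Host Security Group' }
$elbPair = New-Object Amazon.EC2.Model.UserIdGroupPair
$elbPair.GroupId = $elbSgId
$bastionPair = New-Object Amazon.EC2.Model.UserIdGroupPair
$bastionPair.GroupId = $bastionSg.GroupId
$httpFromElb    = @{ IpProtocol = 'tcp'; FromPort = 80; ToPort = 80; UserIdGroupPairs = $elbPair }
$sshFromBastion = @{ IpProtocol = 'tcp'; FromPort = 22; ToPort = 22; UserIdGroupPairs = $bastionPair }
Grant-EC2SecurityGroupIngress -GroupId $webSgId -IpPermission @($httpFromElb, $sshFromBastion)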
Today I am going to attempt to take some PowerShell functions I wrote on Windows and run them on Linux. This should all be possible now that Microsoft Loves Linux! With the new .NET Core going open-source and cross-platform, combined with AWS's Tools for PowerShell Core, I should be able to run the exact same functions across Windows and Linux.
For this exercise I will be using an Ubuntu virtual machine on Hyper-V, but this could easily be done on CentOS or various other Linux distros. Microsoft recently added support for installing PowerShell through popular distros' default package managers, so we will take that approach to get up and running.
Enough intro, let's get to it! I am going to use Microsoft's provided steps in a bash terminal window to register the Microsoft repo and get the latest PowerShell 6 alpha installed and running.
After running those commands, PowerShell is installed and the system leaves us at the PowerShell command prompt.
To verify everything is working I can use $PSVersionTable to output our PowerShell info to the host.
Okay, everything is looking good so far.
Loading AWS Tools for PowerShell Core
Next up is to get the AWS Tools for PowerShell Core loaded. This can be done with the new PowerShell package management cmdlets, specifically Install-Module.
PowerShell
Install-Module -Name AWSPowerShell.NetCore
Oh no, a red error appeared! Quick, email this error to our system administrator to figure out what went wrong! Haha, just kidding. Let's read it.
The error says administrator rights are required to install modules. The suggestions are to change the scope via parameter or to use elevated rights. Run as administrator sure won't work on Linux, so I will do the equivalent: exit out of PowerShell, then sudo powershell back into the PowerShell host.
After a retry from the now elevated PowerShell host, the Install-Module command completes without error.
I want to check the available modules with the Get-Module command and verify the AWSPowerShell.NetCore module is listed now that it's installed.
PowerShell
Get-Module -ListAvailable
Everything checks out and the AWS module is listed right at the top.
Loading my AWS functions from GitHub
I don’t plan on doing any editing of my functions or commits from this system, so I can skip configuring Git and just install it right from the package manager. The neat thing about using Git is that all the nuances that come from working on files between *nix and Windows, like different carriage returns, should be handled behind the scenes by Git.
Shell
sudo apt-get install git
Once Git is installed, I can clone the PowerShellScripts repository from my GitHub.
A quick ls and cd are used to make sure the AWSFunctions folder came down with the repository.
Creating AWS Read Only Access Keys
Since this is just a proof-of-concept exercise, I am going to run a function I built to check the status of a running EC2 instance by looking up its Name tag. The only access I need in AWS IAM is the ability to describe my instances, so we can create a new IAM user with an attached EC2 read-only policy.
The IAM console has become really simple to use with recent updates, but let's cover everything step by step.
First I’ll log into my AWS account and navigate to the IAM console. From there I want to choose Users and then use the Add User button.
I will call the user blogpostec2readonly and check the box for programmatic access, which will generate our access keys.
On the next screen I will choose Attach existing policies directly. The filter box directly below can be used to search for “ec2readonly”, and an AWS managed policy for EC2 read-only access will appear. A managed policy is a prewritten JSON IAM policy maintained by Amazon that helps administrators quickly grant permissions without needing to deep-dive into IAM. Perfect for the use case at hand. I'll check the box for this policy and click Next.
The next screen is a review screen and a final Create User button.
After the new IAM user is created, the access key and secret key are provided for download. Be careful with these, as AWS access keys are all that is needed to access an AWS account. I will copy the provided keys into the gedit text editor so I can use them in the next step.
Configuring AWS PowerShell Module Credentials
All the prep work is nearly completed, and the next steps are to configure the default region, access key, and secret key to be used with the AWS PowerShell module cmdlets. To do this we will import the AWSPowerShell.NetCore module and run the Set-AWSCredentials and Initialize-AWSDefaults cmdlets.
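That sequence looks roughly like this, with placeholder key values and region:

PowerShell
Import-Module AWSPowerShell.NetCore
# Store the keys from the IAM console under a profile name
Set-AWSCredentials -AccessKey 'AKIAEXAMPLEKEY' -SecretKey 'EXAMPLESECRET' -StoreAs default
Initialize-AWSDefaults -ProfileName default -Region us-east-1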
I need to load my functions into memory, so let's use Get-ChildItem to list the function files and dot-source each one. (% in PowerShell is a shorthand alias for ForEach-Object.)
PowerShell
Get-ChildItem AWSFunctions | % { . $_.FullName }
To verify my custom functions are loaded and ready to execute, we can try to tab-complete them. The function I am running in this exercise is Test-RunningEC2InstanceByServerName, so I will type Test-Run and press tab.
Success! Tab completion filled out the name of the function for me. Let's see if it works…
The instance hosting this here blog is called PACKETLOST02, so I will send that server name in as a parameter to the function, expecting it to return that the instance is running.
The function ran and returned that the instance is running.
Summary
How neat was this? I took some PowerShell functions I wrote on the Windows platform, committed them to my GitHub repo, then got them to run on Linux. When I initially wrote these functions, it was to help automate my day-to-day administration of Amazon Web Services, with only the Windows platform in mind. Thanks to the great work of the developers at Microsoft and Amazon Web Services, these functions are now cross-platform.
I hope this post provides a quick glance into how useful and flexible PowerShell can be, as well as how promising .NET Core and the .NET Standard libraries are for cloud computing. Cheers!
So I followed this AWS blog post and this documentation to launch a tiny t2 Elasticsearch cluster to visualize VPC flow logs. Those links have instructions that guide you through setting up flow logs to flow into ES in a few different ways. I ended up following the documentation link and then downloading some Kibana 3 dashboards until I found one I liked.
Over time, however, the little t2 ES cluster could not keep up, and I ran out of storage space and CPU credits. So I wanted to automate the deletion of indices so the cluster would free up storage space and not churn through CPU. Since the cluster uses less CPU when more of its data fits in RAM, I had to limit how much data the single-node ES cluster stores. There is plenty of documentation online on how to use curl to delete Elasticsearch indexes, but I'm on Windows most of the time, so I decided to write a quick PowerShell script to do it.
To use this script, just update the esdomain variable to point to your ES cluster name. Also, this filter will only work if the Lambda script is creating cwl- indexes; tweak it if your indexes are named differently. Run it and it will keep the last two weeks of indexes and delete anything older.
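The core of the script can be sketched like this; the esdomain variable and the cwl- index prefix are from the post, while the endpoint value and the date handling are illustrative:

PowerShell
# Endpoint of the Elasticsearch domain (placeholder value)
$esdomain = 'search-mydomain-abc123.us-east-1.es.amazonaws.com'
$cutoff   = (Get-Date).AddDays(-14)

# List index names, keep only cwl-YYYY.MM.DD indexes older than the cutoff
$oldIndexes = (Invoke-RestMethod -Uri "https://$esdomain/_cat/indices?h=index") -split "`n" |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_ -match '^cwl-\d{4}\.\d{2}\.\d{2}$' } |
    Where-Object { [datetime]::ParseExact($_.Substring(4), 'yyyy.MM.dd', $null) -lt $cutoff }

# Delete each expired index
foreach ($index in $oldIndexes) {
    Invoke-RestMethod -Method Delete -Uri "https://$esdomain/$index"
}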
I tend to copy scripts to C:\scripts on systems, so I copied deleteold.ps1 there and created two new batch files there as well.
The first was used to launch the PowerShell script with switches that delete all files in the IIS logs directory older than 90 days and append to a log of all actions taken.
The second batch file registers a task that runs this script weekly. Right-click this batch file and run as administrator, and IIS logs will no longer be a thorn in your side.
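As an alternative to the batch file, the same weekly task can be registered natively with the ScheduledTasks cmdlets; the paths and schedule below are illustrative, and the original post used a batch file for this step:

PowerShell
# Run deleteold.ps1 every Sunday at 2 AM with elevated rights
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-ExecutionPolicy Bypass -File C:\scripts\deleteold.ps1'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Sunday -At 2am
Register-ScheduledTask -TaskName 'Clean IIS Logs' -Action $action -Trigger $trigger -RunLevel Highest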
As usual, customize as needed! Next on my list for this repetitive task is creating a PowerShell remoting script to copy these files to remote servers and execute the task registration command.
The last two posts I made utilize PowerCLI and VMware Tools to gather information for reporting. So what happens if VMware Tools is not running? Things fall through the cracks!
Luckily, we can report on VMware tools status too.
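One way to get at it with PowerCLI (a sketch; ToolsStatus comes from the vSphere guest data, with values like toolsOk and toolsNotRunning):

PowerShell
# Report VMware Tools status for every VM
Get-VM | Select-Object Name, @{ Name = 'ToolsStatus'; Expression = { $_.ExtensionData.Guest.ToolsStatus } }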
Whether your developers are keeping snapshots around for months or your backup program is leaving behind old snapshots due to backup failures, it's always good to know about your aging snapshots.
Below is a script which will find all VMs with snapshots older than 7 days. As usual, you can customize this to meet your needs; I like to keep transcripts and output files for archival and troubleshooting.
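A minimal sketch of such a script, with illustrative output paths:

PowerShell
# Log everything the script does, then export snapshots older than 7 days
Start-Transcript -Path 'C:\scripts\SnapshotReport.log' -Append
Get-VM | Get-Snapshot |
    Where-Object { $_.Created -lt (Get-Date).AddDays(-7) } |
    Select-Object VM, Name, Created, SizeGB |
    Export-Csv -Path 'C:\scripts\OldSnapshots.csv' -NoTypeInformation
Stop-Transcript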