By: Douglas Correa | Updated: 2017-12-18 | Comments (4) | Related: > Amazon AWS
Problem
Amazon provides good tools to work with its services, and AWSPowershell is a good way to work with those services. The AWS Tools for Windows PowerShell and AWS Tools for PowerShell Core are PowerShell modules built on the functionality exposed by the AWS SDK for .NET. The AWS PowerShell Tools enable you to script operations on your AWS resources from the PowerShell command line.
One issue we face is sending large files from a local disk to an Amazon S3 bucket. Uploading files through the browser console can be very slow, can consume far more resources on your machine than expected, and can take days to finish. Also, the browser doesn't provide any kind of control over the files. How can we address this with PowerShell?
Solution
In this first part, I will focus on how to work with an AWS S3 bucket and how to solve the slowness of uploading files to your bucket. First, download and install the AWS SDK using the link https://aws.amazon.com/powershell/.
The installation process is quite simple: you can install all options, as I did, or install only the AWS Tools for Windows PowerShell.
After installation you will see two new items in the Start menu: a guide for the AWS Tools and Windows PowerShell for AWS.
The first item is a guide to using the installed tools.
Windows PowerShell for AWS opens with the AWSPowershell module already imported. To see which cmdlets are available, execute:
Get-Command -Module AWSPowershell | Where-Object Name -like *S3*
In this case I want only the S3 cmdlets.
Also, you can use the PowerShell ISE to work with the available cmdlets. Run the cmdlets below to make the module available:
Install-Package -Name AWSPowerShell
Import-Module AWSPowershell
With AWSPowershell we can manage services such as EC2, CloudWatch, IAM, SNS, SQS and so on. Regarding S3 you can create and delete Amazon S3 buckets, upload files to an Amazon S3 bucket as objects, delete objects from an Amazon S3 bucket and much more.
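As a quick illustration of those S3 operations, here is a hedged sketch of basic bucket management with AWSPowershell. The bucket name is a placeholder of my own (bucket names must be globally unique), and the region is assumed:

```powershell
# Placeholder bucket name -- replace with your own globally unique name.
$bucket = 'my-example-bucket-12345'

New-S3Bucket -BucketName $bucket -Region us-east-2          # create a bucket
Test-S3Bucket -BucketName $bucket                           # returns $true if the bucket exists
Get-S3Bucket                                                # list all buckets in the account
Remove-S3Bucket -BucketName $bucket -DeleteBucketContent    # delete the bucket and its objects
```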
To connect to your AWS account you need the access key and secret key for the user account you will use in the session.
Initialize-AWSDefaultConfiguration -AccessKey AKIAIOSFODNN7EXAMPLE -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY -Region us-west-2
That cmdlet saves the specified credential keys and default region selection to a profile named 'default'. The credentials and region are set as active in the current shell.
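If you would rather not type the keys in every session, the module can also persist them under a named profile. A short sketch, assuming a profile name of my own choosing:

```powershell
# Save the keys once under a named profile in the local credential store.
Set-AWSCredential -AccessKey AKIAIOSFODNN7EXAMPLE -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY -StoreAs MyS3Profile

# In later sessions, activate that profile instead of typing the keys again.
Initialize-AWSDefaultConfiguration -ProfileName MyS3Profile -Region us-west-2
```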
Let's create variables for the common information and write a simple script that verifies the bucket exists and, for each file that is not already in the bucket, uploads it.
$bucket = 'your bucket name'
$source = 'your local path'
$AKey = 'AKIAIOSFODNN7EXAMPLE'
$SKey = 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
$region = 'us-east-2'

Initialize-AWSDefaultConfiguration -AccessKey $AKey -SecretKey $SKey -Region $region

Set-Location $source
$files = Get-ChildItem '*.bak' | Select-Object -Property Name

try {
    if (Test-S3Bucket -BucketName $bucket) {
        foreach ($file in $files) {
            if (!(Get-S3Object -BucketName $bucket -Key $file.Name)) { ## verify if the object already exists
                Write-Host "Copying file : $file "
                Write-S3Object -BucketName $bucket -File $file.Name -Key $file.Name -CannedACLName private
            }
        }
    }
    else {
        Write-Host "The bucket $bucket does not exist."
    }
}
catch {
    Write-Host "Error uploading file $file"
}
The PowerShell ISE will show the uploading message.
You may see errors if your default configuration is not set properly. For example:
This error occurred because the cmdlet ran with the wrong region value.
Another error could look like this:
In this case the -File parameter passed to the Write-S3Object cmdlet was wrong, but below that error message you can see a second one with different information: "The bucket you are attempting to access must be addressed using the specified endpoint...".
The first line shows the "right" error: "The file indicated by the FilePath property does not exist".
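To surface the underlying exception text yourself instead of relying on the console output, you can read it from the error record in a catch block. A minimal sketch, with placeholder bucket and file names:

```powershell
try {
    # Placeholder names -- substitute your own bucket and file.
    Write-S3Object -BucketName 'my-example-bucket' -File 'backup1.bak' -Key 'backup1.bak'
}
catch {
    # $_ is the current error record; Exception.Message holds the detailed text.
    Write-Host "Upload failed: $($_.Exception.Message)"
}
```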
Conclusion
This is the first part of using PowerShell to work with AWS services. In my tests, uploading 100 GB via the browser console took a day. Using AWSPowershell, it took a few hours to upload 14 files totaling over 160 GB on the same network, without using S3 transfer acceleration.
Also, as you saw, I created a simple script to verify whether a file exists before uploading it. With PowerShell you have much more control over how the files are sent: you can change the order, log errors, schedule when to send the files to S3, and, if a file already exists in the bucket, check the modified date to decide whether or not to update it.
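The modified-date check mentioned above could be sketched like this. This is an assumption on my part, not part of the original script; the bucket name is a placeholder, and it compares the local file's last write time (in UTC) against the S3 object's LastModified property:

```powershell
# Re-upload a file only when it is missing from the bucket or the local copy is newer.
$bucket = 'my-example-bucket'

foreach ($file in Get-ChildItem '*.bak') {
    $s3obj = Get-S3Object -BucketName $bucket -Key $file.Name
    if (-not $s3obj -or $file.LastWriteTimeUtc -gt $s3obj.LastModified) {
        Write-Host "Uploading $($file.Name)"
        Write-S3Object -BucketName $bucket -File $file.FullName -Key $file.Name
    }
}
```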
Next Steps
About the author
This author pledges the content of this article is based on professional experience and not AI generated.