Working safely with the AWS Command Line tool

The AWS Command Line provides all the functionality necessary to script and automate your AWS usage. Just like the browser-based Console, fine detail can be managed and a broader overview visualised. One of the challenges of moving away from the browser interface, however, is the loss of immediate feedback and prompting. When running a command, it is very easy to mistype or supply incorrect detail and end up with a very hefty bill, or something much worse.

Of course, like anyone who uses the command line, we make notes of commands and write scripts so we don't have to remember them. In the simplest form, these could be bash functions within a profile script, or individual files sitting in a bin directory within our environment. For some people, this is perfectly fine and suitable. For others, it can be restrictive, unintuitive, prone to regression, and difficult to share and learn from.
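For instance, a profile-script helper can pin the region and profile so a slip of the fingers cannot touch the wrong account. This is a minimal sketch; the function name, the dev profile, and the Sydney region are assumptions for illustration:

```shell
# Sketch of a profile-script helper; the function name, "dev" profile,
# and Sydney region are illustrative assumptions.
list_instances() {
    command -v aws >/dev/null || { echo "aws CLI not installed" >&2; return 1; }
    # Pin the region and profile so a mistyped command cannot
    # accidentally run against another account or location.
    aws ec2 describe-instances \
        --region ap-southeast-2 \
        --profile dev \
        --output table
}
```

Sourcing this from a profile script makes the safe form the easy form: typing `list_instances` is shorter than typing the raw command with no region pinned.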

The simplest way to work more safely with the command line is to grant only the explicit permissions needed for the AWS services required by the specific use-case. When API credentials are generated through the browser Console using the Identity & Access Management tool, specific permission sets are selected; in the AWS environment, these are known as Policies. Choosing these carefully helps ensure that only the services intended to be used can be used. For instance, if you know that DynamoDB will not be interacted with, don't add any of the policy items relating to it (as with other AWS services, those managed policies are prefixed with the service name, here AmazonDynamoDB). It is even possible, and a good idea, to restrict access to specific AWS regions. If all infrastructure and development will be undertaken within Sydney, don't allow access to any other region.
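As an illustration, an IAM policy along these lines allows EC2 actions only when the request targets the Sydney region. Treat this as a hedged sketch: the broad `ec2:*` action, the wildcard resource, and the `aws:RequestedRegion` condition key are choices made for the example, not a prescription.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "ec2:*",
      "Resource": "*",
      "Condition": {
        "StringEquals": { "aws:RequestedRegion": "ap-southeast-2" }
      }
    }
  ]
}
```

A real policy would usually narrow the actions and resources further still, but even this shape means a command accidentally aimed at another region is denied outright.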

Group functionality exists within the Identity & Access Management tool to make policy crafting simple and reusable across many users. Within an organisation, it would be cumbersome and dangerous to manage individual policy sets for each user. By creating a group, assigning the policies appropriate to that type of group, and then assigning users to it, large-scale policy sets can be defined and maintained. For instance, a Developer group could be created that allows access to EC2, DynamoDB, and others as necessary, but not to SQS or SNS. A user within the Developer group could then interact with instances of the allowed services while being prevented from API interaction with the others. Development environments could be created, developed against shared provisions of other services, and managed in an overall capacity by an Administrator.
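The steps above can also be done from the command line with the aws iam subcommands. The group and user names below are hypothetical, and the calls are wrapped in a function so the sketch can be read and adjusted before anything is executed:

```shell
# Hypothetical group and user names; AmazonEC2FullAccess and
# AmazonDynamoDBFullAccess are AWS-managed policies.
setup_developer_group() {
    aws iam create-group --group-name Developers
    aws iam attach-group-policy \
        --group-name Developers \
        --policy-arn arn:aws:iam::aws:policy/AmazonEC2FullAccess
    aws iam attach-group-policy \
        --group-name Developers \
        --policy-arn arn:aws:iam::aws:policy/AmazonDynamoDBFullAccess
    # No SQS or SNS policies attached, so those APIs remain closed
    # to members of this group.
    aws iam add-user-to-group --group-name Developers --user-name alice
}
```

Call `setup_developer_group` once the names match your own accounts; everything not attached here stays denied by default.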

When running the aws command for the first time, it will be noted that access credentials are required. During the Identity & Access Management work above, when policies are selected and added for a user, an Access Key ID and Secret can be created for this purpose. Multiple users can also be created with different permissions when different sets of service access are required: management of certain types of services can be handled by one user, and entirely separate services can be managed by another.
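These credentials end up in the shared credentials file, and `aws configure --profile <name>` writes them there for you. A sketch with two hypothetical users, each holding a different permission set (all key values are placeholders):

```ini
# ~/.aws/credentials -- profile names and key values are placeholders
[developer]
aws_access_key_id = AKIAEXAMPLEDEVKEY
aws_secret_access_key = exampleDevSecretValue

[service-admin]
aws_access_key_id = AKIAEXAMPLEADMINKEY
aws_secret_access_key = exampleAdminSecretValue
```

A command then picks the appropriate set of permissions explicitly, for example `aws s3 ls --profile developer`, rather than silently using whichever credentials happen to be the default.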

One of the easiest ways to learn the AWS Command Line is to simply look at the commands, view their help, and try them out. The browser console will contain the most up-to-date information about all services and the instances within them, so it's a great idea to keep a tab open in your browser to view any changes that take place from commands that are executed. If worst comes to worst, and an incorrect command is run or unintended outcomes have taken place, having the console readily open will help with reversing any changes quickly.

Managing scripts to give AWS commands a shorthand is an obvious way to save time and prevent some degree of misuse. Whilst this is fine for simple items, a lot of flexibility is quickly lost when extra options are required or requirements change. There is also the need to share configurations between many users. It would make sense, in that case, to maintain some form of repository where these configurations could be stored and made available, and where collaboration could be fostered to provide the same functionality across a wider team.

Manageacloud provides an even simpler approach, reducing errors and duplication, by working with Macfiles, Infrastructures, and Configurations. Server Configurations can be stored to readily create instances of common environments. Each server type can have its complete configuration defined, so when the need arises for an instance, it can be made available. Infrastructures allow for full deployment of multiple configurations and service types: to have a replica of the larger system, an infrastructure can be used and all configurations within it made available and pre-configured as a single environment. The Macfile is the structure by which an Infrastructure is defined; server configurations can be referenced within an Infrastructure using the Roles functionality.
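A Macfile is plain YAML. The sketch below only gestures at the shape of one, tying a role to a server configuration and an infrastructure to that role; the section names, configuration name, and hardware values are assumptions for illustration rather than verified Macfile syntax:

```yaml
# Hypothetical Macfile sketch; keys and values are illustrative.
mac: 0.9
description: Replica development environment
name: demo_environment
version: "1.0"
roles:
  app:
    instance create:
      configuration: basic_web_server   # a stored server Configuration
infrastructures:
  sydney_stack:
    provider: amazon
    location: ap-southeast-2            # keep everything in one region
    hardware: t2.micro
    role: app
    amount: 1
```

The point of the structure is that the whole environment is declared in one reviewable file, rather than assembled from ad-hoc commands.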

With this understanding, AWS commands can be stored within an infrastructure and run as needed with minimal direct user input. There is no longer the same danger of selecting an invalid type or service configuration, as this is not required from the user when the infrastructure is run. Each AWS command can be seen easily in the Macfile content; the Infrastructure is completely transparent.

Written by Allan Shone on Sunday November 22, 2015
Permalink - Tags: aws, cli
