I deploy to AWS a lot. I believe it’s fine to use the AWS GUI when you’re exploring, but otherwise it is better to write scripts to achieve results, be it Bash scripts that use the AWS CLI or Python scripts that use the boto3 library. Writing scripts guarantees that when you forget how to properly deploy a cluster of Elasticsearch instances and shards, you will just run your script instead of researching the AWS documentation again. The AWS CLI itself is a Python package installed via pip.
I try to keep the installation of the AWS CLI isolated from everything else, which makes it possible to have multiple installations with different versions. Here is how I achieve that on macOS.
Create the `~/.aws` directory. This directory will be created automatically when you configure the AWS CLI with the `aws configure` command. However, I prefer to create it ahead of time, because the installation will be kept here.
Create a `~/.aws/.python-version` file that contains the Python version to be used for the AWS CLI. As of now, this file contains `3.6.5`. This, of course, assumes that you use pyenv to manage Python installations. Also, the version you choose must have Pipenv installed in it.
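Assuming pyenv is already installed, the two steps above can be sketched as follows (`3.6.5` is just the version used in this post; substitute your own):

```shell
# Create the directory that will hold the isolated AWS CLI installation.
mkdir -p ~/.aws

# Pin the Python version pyenv should use inside this directory.
# The version must already be installed (pyenv install 3.6.5)
# and must have Pipenv available in it.
echo "3.6.5" > ~/.aws/.python-version
```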
Install the `awscli` library. Once you are in the `~/.aws` directory, just run:
pipenv install awscli
This will create a dedicated virtual environment and install the AWS CLI there, so that it’s available exclusively from that place. Pipenv will know which virtual environment to use in this directory thanks to the Pipfile.
We now have an isolated and clean installation of the AWS CLI. How can we make it available across the system? At the moment, the only way to run the AWS CLI is to fire up Pipenv from the directory with the Pipfile:
pipenv run aws
Instead, let’s make a symlink that redirects any call to the `aws` command into our dedicated virtual environment. It’s pretty easy.
Create a `~/.aws/bin/aws` file with a small wrapper script inside.
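The wrapper only needs to change into `~/.aws` and forward all arguments to the Pipenv-managed executable. A minimal sketch of such a script (my version, assuming Bash and the layout above), written with a heredoc, could be:

```shell
mkdir -p ~/.aws/bin
cat > ~/.aws/bin/aws <<'EOF'
#!/bin/bash
# Run the AWS CLI from the dedicated Pipenv environment in ~/.aws,
# forwarding all command-line arguments unchanged.
cd ~/.aws || exit 1
exec pipenv run aws "$@"
EOF
```

The `exec` replaces the wrapper process with the real CLI, so exit codes and signals pass through cleanly.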
Make the script executable:
chmod a+x ~/.aws/bin/aws
Create a symlink to this executable script.
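For example, linking into `~/bin` (an assumption on my part; make sure that directory is on your `PATH` — `/usr/local/bin` is another common choice on macOS, though it may require `sudo`):

```shell
# Put a symlink to the wrapper somewhere on PATH.
mkdir -p "$HOME/bin"
ln -sf "$HOME/.aws/bin/aws" "$HOME/bin/aws"
```

The `-f` flag replaces any existing link, which keeps the step idempotent.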
Voilà, the `aws` executable is now available from any directory.