The idea behind this is to have an independent environment for integrating Amazon Web Services’ objects and services with Python applications.

The GitHub repository with examples can be found here. The README.md will probably serve you better than this blog post if you just want to get started.

The environment is offered in the form of a Docker container, which I am running on Windows 10. The above repository has a Dockerfile available, so the container can be built anywhere.

Python 3 is the language of choice for working against AWS, and for that the boto3 library is needed. It is the AWS SDK for Python, installable with pip, and it is used to integrate Python applications with AWS services.

Bare minimum

To get started, all that is needed is an access key and a secret key (which requires an IAM user with assigned policies), Python, and boto3 installed.
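
A minimal sketch of that bare minimum might look like this; the key values and region below are placeholders, and real credentials should never be hard-coded:

```python
import boto3

# Placeholder values for illustration only – never hard-code real credentials
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",       # the IAM user's access key
    aws_secret_access_key="wJalr...",  # the IAM user's secret key
    region_name="eu-west-1",           # assumed region
)

print(s3.list_buckets()["Buckets"])
```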

The policies the user is assigned are reflected in what the Python code is allowed to do. It can be frustrating at the beginning to pin down the right policies, so for testing purposes you might give the user full rights to a service and narrow them down later.

Where to begin

The best service to begin with is the object storage service AWS S3, where you can work with buckets (folders) and objects (files). You also see immediate results in the AWS console. Costs are minimal, and there are no services running “under” S3 that need attention first. My repository has a simple Python package which lists all available buckets.
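
The package itself lives in the repository, but the core of such a script is small. A rough sketch, assuming credentials are already configured in the environment:

```python
import boto3

def list_buckets():
    """Print the name of every bucket the configured credentials can see."""
    s3 = boto3.client("s3")  # credentials are picked up from the environment
    for bucket in s3.list_buckets()["Buckets"]:
        print(bucket["Name"])

if __name__ == "__main__":
    list_buckets()
```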

Credentials and sessions

To integrate a Python application with AWS services, an IAM user is needed, along with that user’s access key and secret key. The credentials can be provided in different ways; in this case, I have used sessions – which allow the user (dev, test, prod…) to be changed at runtime. This example of a credentials file with multiple profiles gives the general idea about how to create multiple sessions.
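
For reference, the standard ~/.aws/credentials file follows this shape; the profile names and key values here are placeholders:

```ini
[dev]
aws_access_key_id = AKIA...DEV
aws_secret_access_key = ...

[test]
aws_access_key_id = AKIA...TEST
aws_secret_access_key = ...
```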

The Python test file shows how to initialize a session.
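
Initializing a session against one of those profiles is essentially a one-liner; a minimal sketch, assuming a profile named "dev" exists in the credentials file:

```python
import boto3

# "dev" is a hypothetical profile name from the credentials file above
session = boto3.Session(profile_name="dev")
s3 = session.client("s3")
```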

Exception handling

Handling exceptions in Python 3 with boto3 is demonstrated in the test package. Note that the exception being caught is a boto3 exception.
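
In practice, most service errors surface as botocore’s ClientError, which boto3 raises. A sketch of catching it, with a hypothetical bucket name:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.Session(profile_name="dev").client("s3")

try:
    s3.head_bucket(Bucket="some-bucket-that-may-not-exist")  # hypothetical bucket
except ClientError as err:
    # The response carries the AWS error code, e.g. "404" or "403"
    code = err.response["Error"]["Code"]
    print(f"S3 call failed with error code {code}")
```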

Further work

The environment is set up: PyCharm can be used for software development, while Docker executes the tests.

There is nothing stopping you from developing a Python application.

After gaining some confidence, it would be smart to revisit the policies and create ones that allow a user or group exactly what they need to be allowed.
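
For example, a least-privilege policy that only lets a user list buckets might look something like this (a sketch; the actions and resources would be tailored to your own case):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    }
  ]
}
```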

Dilemma

How far will boto3 take an organization? Is it smart to consider using, for example, Terraform when building a VPC and launching EC2 instances?

It is worth making that decision and using an Infrastructure-as-Code tool on a higher level to automate faster, and perhaps using boto3 for more granular work like manipulating objects in S3 or dealing with users and policies.
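
That granular object work is where boto3 shines. A small sketch, with a hypothetical bucket and key:

```python
import boto3

s3 = boto3.Session(profile_name="dev").client("s3")

# Hypothetical bucket and key, just to illustrate object-level operations
s3.put_object(Bucket="my-example-bucket", Key="hello.txt", Body=b"Hello, S3!")

obj = s3.get_object(Bucket="my-example-bucket", Key="hello.txt")
print(obj["Body"].read().decode())
```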
