Code assistance for boto3, always up to date and in any IDE

21. December 2018

If you’re like me and work with the boto3 SDK to automate your Ops, then you’re probably familiar with this sight:

[Screenshot: an IDE offering no code completion suggestions for a boto3 client]

No code completion! It’s almost as useful as coding in Notepad, isn’t it? This is one of the major quirks of the boto3 SDK. Due to its dynamic nature, we don’t get code completion as we’re used to with other libraries.

I used to deal with this by going back and forth with the boto3 docs. However, this impacted my productivity by interrupting my flow all the time. I had recently adopted Python as my primary language and had second thoughts on whether it was the right tool to automate my AWS stuff. Eventually, I even became sick of all the back-and-forth.

A couple of weeks ago, I thought enough was enough. I decided to solve the code completion problem so that I never have to worry about it anymore.

But before starting it, a few naysaying questions cropped up in my head:

  1. How would I find time to support all the APIs that the community and I want?
  2. Will this work be beneficial to people not using the X IDE?
  3. With 12 releases of boto3 in the last 15 days, will this become a full time job to continuously update my solution?

Thankfully, I found a lazy programmer’s solution that I could conceive in a weekend. I put up an open source package and released it on PyPI. I announced this on reddit and within a few hours, I saw this:

[Screenshot: the Reddit announcement quickly gathering upvotes and comments]

Looks like a few people are going to find this useful! 🙂

In this post I will describe botostubs, a package that gives you code completion for boto3, all methods in all APIs. It even automatically supports any new boto3 releases.

Read on to learn about a couple of less-used facilities in boto3 that made this project possible. You will also learn how I automated myself out of the job of maintaining botostubs by leveraging a simple deployment pipeline on AWS that costs about $0.05 per month to run.

What’s botostubs?

botostubs is a PyPI package, which you can install to give you code completion for any AWS service. You install it in your Python runtime using pip, add “import botostubs” to your scripts plus a type hint for your boto3 clients, and you’re good to go:

[Screenshot: installing botostubs with pip and adding the import and type hint]
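
Here’s roughly what that looks like in code (a minimal sketch, assuming the service classes are exposed as e.g. botostubs.S3):

    import boto3
    import botostubs

    # The type hint is what unlocks code completion; at runtime,
    # s3 is still a plain, dynamically generated boto3 client.
    s3: botostubs.S3 = boto3.client('s3')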

Now, instead of “no suggestions”, your IDE can offer you something more useful like:

[Screenshot: the IDE suggesting client methods for boto3]

The parameters in boto3 are dynamic too, so what about them?

With botostubs, you can now get to know which parameters are supported and also which are required or optional:

[Screenshot: the IDE showing which parameters are supported, required and optional]

Much more useful, right? No more back-and-forth with the boto3 docs, yay!

The above is for IntelliJ/PyCharm, but will this work in other IDEs?

Here are a couple of screenshots of botostubs running in Visual Studio Code:

[Screenshots: botostubs code completion at work in Visual Studio Code]

Looks like it works! You should be able to use botostubs in any IDE that supports code completion from Python packages.

Why is this an issue in the first place?

As I mentioned before, the boto3 SDK is dynamic, i.e. the methods and APIs don’t exist as code. As it says in the guide:

It uses a data-driven approach to generate classes at runtime from JSON description files …

The SDK maintainers do it so they can enhance the SDK reliably and quickly. This is great for the maintainers but terrible for us, the end users of the SDK.

Therefore, we need statically defined classes and methods. Since boto3 doesn’t work that way, we need a separate solution.

How botostubs works

At a high level, we need a way to discover all the available APIs, find out about the method signatures and package them up as classes in a module.

  1. Get a boto3 session
  2. Loop over its available clients
  3. Find out about each client’s operations
  4. Generate class signatures
  5. Dump them in a Python module

I didn’t know much about boto3 internals before so I had to do some digging on how to accomplish that. You can use what I’ve learnt here if you’re interested in building tools on top of boto3.

First, about the clients. It’s easy when you already know which API you need, e.g. with S3, you write:

client = boto3.client('s3')

But in our situation, we don’t know which ones are there in advance. I could have hardcoded them, but I needed a scalable and foolproof way. I found out that the way to do that is with a session’s get_available_services() facility.

[Screenshot: calling get_available_services() on a boto3 session]
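
A minimal sketch of steps 1 and 2 using that facility:

    import boto3

    session = boto3.Session()

    # Yields every service name that boto3.client() accepts,
    # e.g. ['acm', 'apigateway', ..., 's3', ...]
    for service_name in session.get_available_services():
        # a region is needed to construct a client
        client = session.client(service_name, region_name='us-east-1')
        # each client is then inspected as shown in the next steps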

Tip: Much of what I’ve learnt has been through IntelliJ’s debugger. Very handy, especially when dealing with dynamic code.

[Screenshot: inspecting boto3 internals in the debugger]

For example, to learn what tricks are involved in getting the dynamic code to convert to actual API calls to AWS, you can place a breakpoint in _make_api_call, found in boto3’s client.py:

[Screenshot: a breakpoint set in _make_api_call in boto3’s client.py]

Steps 1 and 2 solved. Next, I had to find out which operations are possible in a scalable fashion. For example, the S3 API supports about 98 operations for listing objects, uploading and downloading them. Coding 98 operations by hand is no fun, so I had to get creative.

Digging deeper, I found out that clients have an internal botocore service model that had everything I was looking for. Through the service model, you can find the service documentation, API version, etc.

Side note: botocore is a factored-out library that is shared with the AWS CLI. Much of what boto3 is capable of is actually powered by botocore.

In particular, we can read the available operation names. E.g. the service model for the ACM API returns:

[Screenshot: the operation names returned by the ACM service model]

Step 3 was therefore solved with:

[Screenshot: the code that lists each client’s operations]
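
In essence, step 3 boils down to something like this (a sketch; the screenshot above shows the actual code):

    import boto3

    client = boto3.client('acm', region_name='us-east-1')
    service_model = client.meta.service_model

    # Operation names straight from the API's JSON description,
    # e.g. ['AddTagsToCertificate', 'DeleteCertificate', ...]
    print(service_model.operation_names)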

Next, we need to know what parameters are available for each operation. In boto parlance, they are called “input shapes”. (Similarly, you can get the output shape if needed.) Digging some more in the service model source, I found out that we can get the input shape with the operation model:

[Screenshot: retrieving the input shape from the operation model]
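
Sketched in code, continuing with the service model from above (the operation name here is just for illustration):

    operation = service_model.operation_model('RequestCertificate')
    input_shape = operation.input_shape

    print(input_shape.required_members)  # e.g. ['DomainName']
    print(list(input_shape.members))     # all parameters, required and optional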

This told me the required and optional parameters. The missing part of generating the method signatures was then solved. (I don’t need the method body since I’m generating stubs.)

Then it was a matter of generating classes based on the clients and operations above and packaging them in a Python module.
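
To give an idea, a highly simplified sketch of that generation step might look like this (the real botostubs output is richer; botocore’s xform_name gives the snake_case method names that boto3 exposes):

    from botocore import xform_name

    def generate_stub_class(class_name, service_model):
        lines = ['class %s:' % class_name]
        for operation_name in service_model.operation_names:
            shape = service_model.operation_model(operation_name).input_shape
            params = list(shape.members) if shape else []
            # Only the signature matters to the IDE, so the body is a no-op.
            # ListCertificates becomes list_certificates, and so on.
            lines.append('    def %s(%s): pass' % (
                xform_name(operation_name), ', '.join(['self'] + params)))
        return '\n'.join(lines)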

For any version of boto3, I run my script followed by the twine PyPI utility, and out comes a PyPI package that’s up to date with upstream boto3. All of that took about 100 lines of Python code.

Another problem remained to be solved though: with a new boto3 release every time you change your t-shirt, I would need to run the script and re-upload to PyPI several times a week. Wouldn’t this become a maintenance hassle?

The deployment pipeline

To solve this problem, I looked to AWS itself. The simplest way I found was to use their build tool and invoke it on a schedule. What I want is a way to get the latest boto3 version, run the script and upload the artefact to PyPI, all without my intervention.

The relevant AWS services to achieve this are CloudWatch Events (to trigger other services on a schedule), CodeBuild (a managed build service in the cloud) and SNS (for email notifications). This is what the architecture looks like on AWS:

[Diagram: the deployment pipeline architecture on AWS]

Image generated with viz-cfn

The image above depicts the CloudFormation template used for the deployment, which is available on Github along with the code.

The AWS CodeBuild project looks like this:

[Screenshots: the CodeBuild project configuration]

To keep my credentials outside of source control, I also attached a service role to give CodeBuild permissions to write logs and read my PyPI username and password from the Systems Manager parameter store.
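
For illustration, here’s what reading such a secret from the parameter store looks like with boto3 (the parameter name here is made up):

    import boto3

    ssm = boto3.client('ssm', region_name='us-east-1')

    # WithDecryption is required for SecureString parameters
    response = ssm.get_parameter(Name='/botostubs/pypi-password',
                                 WithDecryption=True)
    pypi_password = response['Parameter']['Value']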

I also enabled the build badge feature so that I can show the build status on Github:

[Screenshot: the build badge displayed on Github]

For intricate details, check out the buildspec.yml and the project definition.

I want this project to be invoked on a schedule (I chose every 3 days) and I can accomplish that with a CloudWatch Event Rule:

[Screenshot: the CloudWatch Event Rule with a 3-day schedule]
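
Mine is defined in the CloudFormation template, but the equivalent rule could be sketched with boto3 like this (the names and ARNs are made up):

    import boto3

    events = boto3.client('events', region_name='us-east-1')

    # Fire every 3 days
    events.put_rule(Name='botostubs-build-schedule',
                    ScheduleExpression='rate(3 days)',
                    State='ENABLED')

    # Point the rule at the CodeBuild project; the role must allow
    # CloudWatch Events to call codebuild:StartBuild
    events.put_targets(
        Rule='botostubs-build-schedule',
        Targets=[{
            'Id': 'botostubs-project',
            'Arn': 'arn:aws:codebuild:us-east-1:123456789012:project/botostubs',
            'RoleArn': 'arn:aws:iam::123456789012:role/start-codebuild',
        }])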

When the rule gets triggered, I see that my CodeBuild project does what it needs to do: clone the git repo, generate the stubs and upload to PyPI:

[Screenshot: the CodeBuild logs showing the clone, stub generation and PyPI upload]

This whole process is done in about 25 seconds. Since this is entirely hands-off, I needed some way to be kept in the loop. After the build has run, another CloudWatch Event gets triggered for build events on the project. It sends a notification to SNS, which in turn sends me an email to let me know if everything went OK:

[Screenshot: the CloudWatch Event for build notifications]

The build event and notification.

[Screenshot: the SNS topic configuration]

The SNS Topic with an email subscription.

That’s it! But what about my AWS bill? My estimate is that it should be around $0.05 every month. That will definitely not break the bank, so I’m pretty satisfied! Imagine how much it would cost to maintain a build server on your own to accomplish all of that.

What’s with the weird versioning?

You will notice botostubs versions look like this:

[Screenshot: botostubs release versions on PyPI]

It currently follows boto3 releases in the format 0.4.x.y.z. Therefore, if botostubs is currently at 0.4.1.9.61, it means that it offers whatever is available in boto3 version 1.9.61. I included the boto3 version in mine to make it obvious which version of boto3 botostubs was generated from, but also because PyPI does not allow re-uploading at the same version number.

Are people using it?

According to pypistats.org, botostubs was downloaded about 600 times in its first week, after I showed it to the reddit community. So it seems it was a much-needed tool:

[Chart: botostubs download counts from pypistats.org]

Your turn

If this sounds like something you’ll need, get started by running:

pip install botostubs

Run it and let me know if you have any advice on how to make this better.

Credit

Huge thanks go to another project called pyboto3 for the original idea. The issues I had with it were that it was unmaintained and supported legacy Python only. I wouldn’t have known that this would be possible were it not for pyboto3.

Open for contribution

botostubs is an open source project, so feel free to send your pull requests.

A couple of areas where I’ll need some help:

  • Support Python < 3.5
  • Support boto3 high-level resources (as opposed to just low-level clients)

Summary

In this article, I’ve shared my process for developing botostubs: examining the internals of boto3 and automating its maintenance with a deployment pipeline that handles all the grunt work. If you like it, I would appreciate it if you shared it with a fellow Python DevOps engineer:

https://pypi.org/project/botostubs/.

I hope you are inspired to find solutions for AWS challenges that are not straightforward and share them with the community.

If you used what you’ve learnt above to build something new, let me know, I’d love to take a look! Tweet me @jeshan25.

About the Author

Jeshan Babooa is an independent software developer from Mauritius. He is passionate about all things infra automation on AWS, especially with tools like CloudFormation and Lambda. He is the guy behind LambdaTV, a YouTube channel dedicated to teaching serverless on AWS. You can reach him on Twitter @jeshan25.

About the Editors

Ed Anderson is the SRE Manager at RealSelf, organizer of ServerlessDays Seattle, and occasional public speaker. Find him on Twitter at @edyesed.

Jennifer Davis is a Senior Cloud Advocate at Microsoft. Jennifer is the coauthor of Effective DevOps. Previously, she was a principal site reliability engineer at RealSelf, developed cookbooks to simplify building and managing infrastructure at Chef, and built reliable service platforms at Yahoo. She is a core organizer of devopsdays and organizes the Silicon Valley event. She is the founder of CoffeeOps. She has spoken and written about DevOps, Operations, Monitoring, and Automation.


Getting Started with Boto

05. December 2012

Boto is a Python library that provides you with an easy way to interact with and automate using various Amazon Web Services.

If you’re familiar with Python or interested in learning it, in conjunction with learning and using AWS, you won’t find a better option than Boto.

Installing

Installing boto is very straightforward, assuming you’re using an OS with pip installed. If you do not currently have pip, then install that first.

Once you have pip, the following command will get you up and running.

pip install boto

Basic configuration

This configuration assumes you’ve already created an AWS account and obtained your API Key and Secret Access Key from IAM in the AWS console.

With those in hand, you’ll want to create a .boto file in your home directory and populate it with the secrets.

  • Example .boto:

    [Credentials]
    aws_access_key_id = <your access key>
    aws_secret_access_key = <your secret key>

  • There are some additional configurations you can set, as needed, for debugging, local proxies, etc., as shown below:

    [Boto]
    debug = 0
    num_retries = 10
    proxy = myproxy.com
    proxy_port = 8080
    proxy_user = foo
    proxy_pass = bar

Using boto with EC2

Now that you have a basic .boto file, you can begin using boto with AWS resources.

The most likely place to start is connecting to EC2 and making an instance, which can be done with a few short lines of code.

simple-ec2.py

    import boto.ec2

    regions = boto.ec2.regions()
    oregon = regions[4]  # known from looking at regions[]

    # EC2Connection() will pick up your keys from .boto
    conn = boto.ec2.EC2Connection(region=oregon)

    conn.run_instances('<ami-image-id>')

You can also specify a number of options to the AMI you’re launching.

options-ec2.py

    import boto.ec2

    regions = boto.ec2.regions()
    oregon = regions[4]  # known from looking at regions[]

    # EC2Connection() will pick up your keys from .boto
    conn = boto.ec2.EC2Connection(region=oregon)

    conn.run_instances('<ami-image-id>',
                       key_name='myKey',
                       instance_type='c1.xlarge',
                       security_groups=['your-security-group-here'])

The EC2 API has a number of options and function calls you will find useful in managing your EC2 resources with boto.

Using boto with VPC

EC2 isn’t the only service boto supports; one of my favorites, VPC, is also supported.

With a few short lines of code, you can create a VPC and its various objects.

vpc.py

    # this will create a VPC, a single subnet, and attach an
    # internet gateway to the VPC
    import boto.ec2
    import boto.vpc

    regions = boto.ec2.regions()
    oregon = regions[4]  # known from looking at regions[]

    v = boto.vpc.VPCConnection(region=oregon)

    vpc = v.create_vpc('10.20.0.0/24')
    subnet = v.create_subnet(vpc.id, '10.20.10.0/24')
    ig = v.create_internet_gateway()
    v.attach_internet_gateway(ig.id, vpc.id)

The VPC API has a number of options and function calls you will find useful in managing your VPC resources with boto.

What AWS resources are supported?

A variety of services are supported. According to the boto README, they are currently:

Compute

  • Amazon Elastic Compute Cloud (EC2)
  • Amazon Elastic Map Reduce (EMR)
  • AutoScaling
  • Elastic Load Balancing (ELB)

Content Delivery

  • Amazon CloudFront

Database

  • Amazon Relational Database Service (RDS)
  • Amazon DynamoDB
  • Amazon SimpleDB

Deployment and Management

  • AWS Identity and Access Management (IAM)
  • Amazon CloudWatch
  • AWS Elastic Beanstalk
  • AWS CloudFormation

Application Services

  • Amazon CloudSearch
  • Amazon Simple Workflow Service (SWF)
  • Amazon Simple Queue Service (SQS)
  • Amazon Simple Notification Service (SNS)
  • Amazon Simple Email Service (SES)

Networking

  • Amazon Route53
  • Amazon Virtual Private Cloud (VPC)

Payments and Billing

  • Amazon Flexible Payment Service (FPS)

Storage

  • Amazon Simple Storage Service (S3)
  • Amazon Glacier
  • Amazon Elastic Block Store (EBS)
  • Google Cloud Storage

Workforce

  • Amazon Mechanical Turk

Other

  • Marketplace Web Services

Automating with boto

As you can see from the examples above, you can very quickly begin automating your AWS resources with boto.

As you learn boto there are a number of resources to consult.

  1. There are a number of tutorials for some services to help you get started.
  2. The API documentation is very comprehensive.
  3. I find bpython to be very helpful, as its autocompletion makes it easy to quickly and interactively learn new parts of a library. Obligatory bpython and boto action shot.
  4. Reading the boto source code. Never underestimate the power of just going to the source. Looking under the hood and seeing how things are put together can be very valuable and educational.
  5. Join the community. #boto (irc://irc.freenode.net:6667/boto) on freenode and the google group are both excellent places to start.