In this blog, we will learn how to list all the buckets in an AWS account using Python and the AWS CLI. We will also look at different ways to list buckets and filter them using tags.

Using the AWS CLI to list S3 buckets

Listing all buckets

We can list all buckets with the CLI in a single command.

aws s3api list-buckets
Listing buckets with AWS CLI

If you have lots of buckets, this output becomes difficult to read. Fortunately, the AWS CLI supports the --query parameter, which lets us extract just the information we need from the output.

aws s3api list-buckets --query "Buckets[].Name"
AWS CLI – listing buckets with query flag

We can also use jq (a lightweight command-line JSON processor) to do some funky things. The following code prints each bucket name along with the tags associated with it.

for bucket in `aws s3api list-buckets --profile admin-analyticshut | jq .Buckets[].Name | tr -d \"`; do
    echo $bucket
    tags=$(aws s3api get-bucket-tagging --bucket $bucket --profile admin-analyticshut | jq -c '.[][] | {(.Key): .Value}' | tr '\n' '\t')
    echo $tags
done

Listing S3 buckets using Python

We can also easily list all the buckets in the AWS account using Python.

import boto3
from botocore.exceptions import ClientError

#
# Option 1: S3 client returns a list of buckets with each name and its creation date
#
s3 = boto3.client('s3')
response = s3.list_buckets()['Buckets']
for bucket in response:
    print('Bucket name: {}, Created on: {}'.format(bucket['Name'], bucket['CreationDate']))

When we run the above code, we get the following output.

Python – listing buckets with boto3 client
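
If you also want to see each bucket's tags next to its name, similar to the jq example above, here is a minimal client-side sketch using get_bucket_tagging (the tag formatting is just illustrative):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')

for bucket in s3.list_buckets()['Buckets']:
    name = bucket['Name']
    try:
        # get_bucket_tagging raises a ClientError (NoSuchTagSet) when a bucket has no tags
        tags = s3.get_bucket_tagging(Bucket=name)['TagSet']
        tag_string = ', '.join('{}={}'.format(t['Key'], t['Value']) for t in tags)
    except ClientError:
        tag_string = '(no tags)'
    print('{}: {}'.format(name, tag_string))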

Boto3 also provides us with Bucket resources. We can use the all() method of the buckets collection to list all the buckets in the AWS account.

import boto3
from botocore.exceptions import ClientError

#
# Option 2: the S3 resource returns a collection of all bucket resources.
# This is useful if we want to further process each bucket resource.
#
s3 = boto3.resource('s3')
buckets = s3.buckets.all()
for bucket in buckets:
    print(bucket)
Python listing AWS buckets with Boto3 resource

I also tried filtering buckets based on tags. You can have hundreds, if not thousands, of buckets in an account, and the best way to filter them is using tags. Boto3 does provide a filter method on the buckets collection, but I could not find out how to use it for tags. So I tried a workaround to filter buckets by tag value in Python.

#
# Option 3: Filtering buckets by tag value.
# This did not work as I expected. There is a filter() function on the buckets
# collection, but the documentation does not explain how to use it:
# https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.ServiceResource.buckets
# If anyone knows how to use it, please let us all know.
# buckets1 = s3.buckets.filter(Filters=[{'Name': 'tag:Status', 'Values': ['Logs']}])
#
# The workaround below filters all buckets with a specific tag value.
# The same loop can be used to filter buckets with a specific string in their name.
for bucket in buckets:
    try:
        tag_set = s3.BucketTagging(bucket.name).tag_set
        for tag in tag_set:
            # Each tag is a dict like {'Key': 'Status', 'Value': 'Logs'}
            if tag['Key'] == 'Status' and tag['Value'] == 'Logs':
                print(bucket.name)
    except ClientError:
        # Buckets without tags raise a NoSuchTagSet error; skip them
        pass
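
As noted in the comment above, the same kind of loop can filter buckets by name instead of by tag. Here is a quick sketch (the substring 'logs' is just an example value):

import boto3

s3 = boto3.resource('s3')

# Print buckets whose name contains a given substring ('logs' is only an example value)
substring = 'logs'
for bucket in s3.buckets.all():
    if substring in bucket.name:
        print(bucket.name)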

If you find out how to use the filter method for this approach, please let me know. The function is documented in the boto3 reference linked in the code comments above.
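
In the meantime, another option worth trying is the Resource Groups Tagging API, which can return resources matching a tag filter. Here is a rough sketch using the same example tag (Status=Logs) as above; I have not compared its behaviour with the loop approach:

import boto3

# Sketch: use the Resource Groups Tagging API to find buckets tagged Status=Logs
tagging = boto3.client('resourcegroupstaggingapi')

paginator = tagging.get_paginator('get_resources')
pages = paginator.paginate(
    ResourceTypeFilters=['s3'],
    TagFilters=[{'Key': 'Status', 'Values': ['Logs']}]
)

for page in pages:
    for mapping in page['ResourceTagMappingList']:
        # Bucket ARNs look like arn:aws:s3:::bucket-name
        print(mapping['ResourceARN'].split(':::')[-1])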

Conclusion

We have learned how to list buckets in an AWS account using the CLI as well as Python. Next in this series, we will learn more about performing S3 operations using the CLI and Python. If you are interested, please subscribe to the newsletter. See you in the next blog.
