How to Get the Size of an Amazon S3 Bucket


For reasons I never understood, the Amazon S3 object store does not provide metadata on the size and number of objects in a bucket. This means that in order to answer the simple question “How do I get the total size of an S3 bucket?”, the bucket must be scanned to count the objects and add up their sizes. This is slow, especially when you have millions of objects in a bucket.

In the AWS console, finding the size of an S3 bucket is not very intuitive; the information is hidden several menus deep. Learn how to find the total size, display it graphically in CloudWatch, or retrieve it programmatically from the command line.

How to Get the Size of an Amazon S3 Bucket using the CLI

The AWS CLI supports the --query parameter, which takes a JMESPath expression.

This means that you can sum the sizes of the objects in the listing with sum(Contents[].Size) and count them with length(Contents[]).
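To make the two JMESPath expressions concrete, here is a minimal Python sketch of what they compute, applied to a fabricated list-objects response (the keys and sizes are made up for illustration):

```python
# Sketch: what sum(Contents[].Size) and length(Contents[]) compute,
# applied to a fabricated list-objects response.
response = {
    "Contents": [
        {"Key": "photos/a.jpg", "Size": 1048576},
        {"Key": "photos/b.jpg", "Size": 2097152},
        {"Key": "logs/app.log", "Size": 4096},
    ]
}

total_size = sum(obj["Size"] for obj in response["Contents"])  # sum(Contents[].Size)
object_count = len(response["Contents"])                       # length(Contents[])

print(total_size, object_count)  # 3149824 3
```

The CLI does exactly this aggregation for you, server-side of the output formatting, so you get the two numbers without any post-processing.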

This can be done with the official AWS CLI as follows; the feature was introduced in February 2014:

aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
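One caveat the one-liner glosses over: list-objects returns at most 1,000 keys per call, so for larger buckets the totals have to be accumulated across pages. The sketch below shows that accumulation logic in Python; the pages are fabricated stand-ins for what a real paginator (for example, boto3's list_objects_v2 paginator) would yield:

```python
# Sketch: list-objects returns at most 1,000 keys per page, so totals
# must be accumulated across pages. These pages are fabricated
# stand-ins for what a real paginator would yield.
def summarize(pages):
    """Return (total_bytes, object_count) across list-objects pages."""
    total, count = 0, 0
    for page in pages:
        for obj in page.get("Contents", []):
            total += obj["Size"]
            count += 1
    return total, count

fake_pages = [
    {"Contents": [{"Key": "a", "Size": 100}, {"Key": "b", "Size": 200}]},
    {"Contents": [{"Key": "c", "Size": 300}]},
    {},  # a page with no Contents key (e.g. an empty prefix)
]
print(summarize(fake_pages))  # (600, 3)
```

The CLI handles this pagination for you by default, but if you ever script against the raw API, this is the shape of the loop you need.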

January 2021 Update:

This can now be done trivially with the official AWS command line client alone:

aws s3 ls --summarize --human-readable --recursive s3://bucket-name/

It also accepts path prefixes if you don’t want to count the whole range:

aws s3 ls --summarize --human-readable --recursive s3://bucket-name/directory

Note that this command still enumerates every object under the hood, so it can take a while on buckets with millions of objects, but the Total Objects and Total Size summary lines at the end save you from adding up file sizes yourself. With the --human-readable flag the sizes are printed in friendly units, so you don’t need to take your calculator out.
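As a rough Python sketch of what --human-readable does to a byte count (the AWS CLI uses binary units: KiB, MiB, GiB, and so on; the exact formatting here is an approximation, not the CLI's code):

```python
# Sketch: roughly what --human-readable does to a byte count.
# Binary units (KiB = 1024 bytes), as the AWS CLI uses.
def human_readable(num_bytes):
    for unit in ("Bytes", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if num_bytes < 1024 or unit == "PiB":
            if unit == "Bytes":
                return f"{int(num_bytes)} Bytes"
            return f"{num_bytes:.1f} {unit}"
        num_bytes /= 1024

print(human_readable(3149824))  # 3.0 MiB
print(human_readable(512))      # 512 Bytes
```

This is handy if you capture the raw byte total from the s3api one-liner above and want the same friendly output in your own scripts.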

Using the AWS Web Console and CloudWatch

  • Go to CloudWatch
  • Click on Metrics in the left-hand navigation
  • Click on S3
  • Click on “Storage Metrics”
  • You’ll see a list of all the buckets. Note that there are two possible points of confusion here:
    • a. You will only see buckets containing at least one object.
    • b. You may not see buckets that were created in another region; you may have to change regions using the drop-down menu in the top right-hand corner to see additional buckets.
  • Search for “StandardStorage” in the “Search for a metric, dimension or resource identifier” box.
  • Select the buckets (or all buckets, with the check box to the left under the word “All”) for which you want to total the size.
  • Select at least 3d (3 days) or more in the timeline at the top right of the screen, since the metric is only reported once a day.

You will now see a graph showing the daily size (or other unit) of the list of all the selected buckets over the selected period.
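The console steps above amount to querying the daily BucketSizeBytes metric in the AWS/S3 namespace, which you can also do programmatically. The sketch below builds the request parameters for a CloudWatch get_metric_statistics call; the bucket name and dates are placeholders, and in a real script you would pass this dict to a boto3 CloudWatch client:

```python
from datetime import datetime

# Sketch: the parameters for a CloudWatch get_metric_statistics call
# equivalent to the console steps above. Bucket name and date range
# are placeholders.
params = {
    "Namespace": "AWS/S3",
    "MetricName": "BucketSizeBytes",
    "Dimensions": [
        {"Name": "BucketName", "Value": "bucket-name"},
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    "StartTime": datetime(2021, 1, 1),
    "EndTime": datetime(2021, 1, 4),  # at least 3 days, matching the console tip
    "Period": 86400,                  # the metric is reported once per day
    "Statistics": ["Average"],
}
print(params["MetricName"])  # BucketSizeBytes
```

With a boto3 CloudWatch client, `client.get_metric_statistics(**params)` returns the daily data points whose Average values are the bucket sizes the graph plots.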
