I created the following Lambda function, which prints each object's size and sums up the total bucket size.
I use the convert_size function from here. Credit to @James Sapam.
Code snippet:
import boto3
import math

def convert_size(size_bytes):
    """Convert a byte count into a human-readable string."""
    if size_bytes == 0:
        return "0B"
    size_name = ("B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB")
    i = int(math.floor(math.log(size_bytes, 1024)))
    p = math.pow(1024, i)
    s = round(size_bytes / p, 2)
    return "%s %s" % (s, size_name[i])

bucket_name = 'BUCKET_NAME'

s3 = boto3.resource('s3')
bucket = s3.Bucket(bucket_name)

def lambda_handler(event, context):
    all_objects = bucket.objects.all()
    total_size = 0
    for obj in all_objects:
        # Skip zero-byte "folder" placeholder keys (keys ending in '/').
        if obj.key.split('/')[-1]:
            file_name = obj.key
            file_size = convert_size(obj.size)
            total_size += obj.size
            print("File Name: %s File Size: %s" % (file_name, file_size))
    print("%s bucket size : %s" % (bucket_name, convert_size(total_size)))
Policy JSON (note that s3:ListBucket applies to the bucket ARN, while s3:GetObject applies to object ARNs, so the Resource needs both the bucket and the bucket/* entries):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::${BUCKET_NAME}",
                "arn:aws:s3:::${BUCKET_NAME}/*"
            ]
        }
    ]
}
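If you prefer to attach this inline policy to the function's execution role from code rather than through the console, boto3's IAM client can do it. A sketch only; the role name and policy name below are hypothetical placeholders:

import json
import boto3

# Inline copy of the policy above, with the /* ARN for object-level reads.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::BUCKET_NAME",
                "arn:aws:s3:::BUCKET_NAME/*"
            ]
        }
    ]
}

iam = boto3.client('iam')
iam.put_role_policy(
    RoleName='my-lambda-role',         # hypothetical: your function's execution role
    PolicyName='s3-bucket-size-read',  # hypothetical inline policy name
    PolicyDocument=json.dumps(policy),
)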
Output:
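(Illustrative only; the file names and sizes below are hypothetical and depend on what the bucket contains.)

File Name: images/photo1.jpg File Size: 2.5 MB
File Name: logs/app.log File Size: 14.2 KB
BUCKET_NAME bucket size : 2.51 MB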
If you still run into issues after trying the above solution, take a look at this thread.
He who fights dragons too long becomes a dragon himself; gaze too long into the abyss, and the abyss gazes back into you…