I am pulling data from an API in batches and sending it to an SQS queue. The issue I am having is with processing the messages in order to send the data to DynamoDB. There are supposed to be 147,689 records in the dataset. However, when running the code, sometimes fewer than 147,689 records are put into DynamoDB, sometimes more than 147,689, and sometimes exactly 147,689. It is not consistently putting 147,689 records into the table.
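For context, the producer side looks roughly like the sketch below (the queue URL, function name, and batch variable are placeholders, not my actual values): each SQS message body is a JSON array containing one batch of records pulled from the API.

import json

import boto3

sqs = boto3.client("sqs", "us-east-1")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder


def send_batches(batches):
    # 'batches' is an iterable of lists of record dicts fetched from the API
    for batch in batches:
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps(batch),  # one message body = one JSON array of records
        )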
I have tried everything I can think of to fix this issue, including switching to a FIFO queue instead of a standard queue, increasing the visibility timeout, increasing the delivery delay, and using uuid.uuid1() instead of uuid.uuid4(). I am looping through the "Records" list, so I am not sure why it is not processing the entire batch. Below is my latest code to process the message and send the data to DynamoDB:
import boto3
import json
import uuid
import time

dynamo = boto3.client("dynamodb", "us-east-1")


def lambda_handler(event, context):
    for item in json.loads(event["Records"][0]["body"]):
        item["id"] = uuid.uuid1().bytes
        for key, value in item.items():
            if key == "id":
                item[key] = {"B": bytes(value)}
            elif key == "year":
                item[key] = {"N": str(value)}
            elif key == "amt_harvested":
                item[key] = {"N": str(value)}
            elif key == "consumed":
                item[key] = {"N": str(value)}
            else:
                item[key] = {"S": str(value)}
        time.sleep(0.001)
        dynamo.put_item(TableName="TableOne", Item=dict(item))
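For reference, this is roughly the shape of the event the handler receives from the SQS trigger (the field values and the "commodity" attribute are made up for illustration); I use something like it to exercise the handler locally:

import json

# Illustration only: a stripped-down SQS event. Each element of "Records" is
# one SQS message, and its "body" is a JSON array of record dicts as sent by
# the producer.
sample_event = {
    "Records": [
        {
            "messageId": "059f36b4-87a3-44ab-83d2-661975830a7d",
            "body": json.dumps([
                {"commodity": "corn", "year": 2019, "amt_harvested": 1000, "consumed": 900},
                {"commodity": "wheat", "year": 2019, "amt_harvested": 500, "consumed": 450},
            ]),
        }
    ]
}

# Local sanity check: call the handler directly with the sample event.
# lambda_handler(sample_event, None)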