Srinivasulu Paranduru for AWS Community Builders

Export AWS DynamoDB table values to S3 bucket using Python Boto 3

Use case: Export the values of a DynamoDB table to an S3 bucket so they can be imported into another DynamoDB table in a different AWS account.

Approach: Create an AWS Lambda function in the source account where the DynamoDB table exists, and create an S3 bucket in the same source account. The Lambda function uses Boto3 to scan the table and export the items to the bucket.

  1. Create a Lambda function - dynamodb-export
  2. Create environment variables attached to the Lambda and change the values to match your S3 bucket name and DynamoDB table name
  3. Create an IAM role with all necessary permissions and attach it to the Lambda
    • In the code below the DynamoDB table name is mytable1, and the same name appears in the role policy. If you use more tables, add them to the policy's resource list as well.

  1. Lambda: dynamodb-export
import boto3
import json
import os
import decimal

def convert_decimals(obj):
    # Recursively convert DynamoDB Decimal values to strings so json.dumps can serialize them
    if isinstance(obj, decimal.Decimal):
        return str(obj)
    elif isinstance(obj, list):
        return [convert_decimals(i) for i in obj if i is not None]
    elif isinstance(obj, dict):
        return {k: convert_decimals(v) for k, v in obj.items() if v is not None}
    return obj

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    s3 = boto3.client('s3')
    tablename = os.environ['dynamodb_table_name']
    s3_bucket_name = os.environ['s3_export_bucket_name']
    table = dynamodb.Table(tablename)

    # Scan the full table, following LastEvaluatedKey until every page has been fetched
    response = table.scan()
    items = response['Items']
    while 'LastEvaluatedKey' in response:
        response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
        items.extend(response['Items'])

    # Convert Decimal values only after all pages have been collected
    items = convert_decimals(items)

    export_data = json.dumps(items)
    file_name = f"{tablename}.json"
    s3.put_object(
        Bucket=s3_bucket_name,
        Key=file_name,
        Body=export_data
    )
    return {
        'statusCode': 200,
        'body': json.dumps(f'Exported {len(items)} items to S3')
    }
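The Decimal conversion step matters because Boto3 returns DynamoDB number attributes as decimal.Decimal, which json.dumps cannot serialize. A minimal sketch of the failure and the fix (the item used here is just an illustration):

import json
import decimal

item = {"id": "1", "price": decimal.Decimal("19.99")}  # how Boto3 returns a number attribute

try:
    json.dumps(item)
except TypeError as err:
    print(err)  # Object of type Decimal is not JSON serializable

# After converting the Decimal to a string, the item serializes cleanly
print(json.dumps({"id": "1", "price": str(item["price"])}))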

2. Add environment variables attached to the Lambda

dynamodb_table_name : mytable1
s3_export_bucket_name : s3 bucket name
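
If you prefer to set these from a script rather than the AWS console, here is a small Boto3 sketch (the bucket name my-export-bucket is just a placeholder; note that this call replaces the function's existing environment variables):

import boto3

lambda_client = boto3.client('lambda')

# Set the two environment variables the export function reads
lambda_client.update_function_configuration(
    FunctionName='dynamodb-export',
    Environment={
        'Variables': {
            'dynamodb_table_name': 'mytable1',
            's3_export_bucket_name': 'my-export-bucket'  # placeholder, use your bucket name
        }
    }
)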


3. IAM role attached to the Lambda

  • IAM Role Policy
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "logs:CreateLogGroup",
            "Resource": "arn:aws:logs:AWS_Region:AWS_Account_Id:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": [
                "arn:aws:logs:AWS_Region:AWS_Account_Id:log-group:/aws/lambda/dynamodb-export:*"
            ]
        },
        {
            "Action": [
                "dynamodb:Scan"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:dynamodb:*:*:table/mytable1"
            ]
        },
        {
            "Action": [
                "s3:PutObject"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::S3_Bucket_Name/*"
            ]
        }
    ]
}
  • IAM Role Trust relationship
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "lambda.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
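If you want to create this role from a script rather than the console, a Boto3 sketch along these lines should work (role, policy, and file names are illustrative; the two JSON documents above are assumed to be saved locally):

import json
import boto3

iam = boto3.client('iam')

# Load the trust relationship and role policy shown above (paths are illustrative)
with open('trust-policy.json') as f:
    trust_policy = f.read()
with open('role-policy.json') as f:
    role_policy = f.read()

# Create the role with the Lambda trust relationship, then attach the inline policy
role = iam.create_role(
    RoleName='dynamodb-export-role',
    AssumeRolePolicyDocument=trust_policy
)
iam.put_role_policy(
    RoleName='dynamodb-export-role',
    PolicyName='dynamodb-export-policy',
    PolicyDocument=role_policy
)
print(role['Role']['Arn'])  # attach this role ARN to the dynamodb-export Lambda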

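Finally, on the import side in the target account, a minimal sketch of reading the exported file and loading it into another table (assuming the target account can read the exported object, a table with a matching key schema already exists, and all names here are illustrative):

import json
import boto3

s3 = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')

# Illustrative names: the bucket holding the export file and the target table
obj = s3.get_object(Bucket='my-export-bucket', Key='mytable1.json')
items = json.loads(obj['Body'].read())

table = dynamodb.Table('mytable1-copy')

# batch_writer batches and retries the PutItem calls for us
with table.batch_writer() as batch:
    for item in items:
        batch.put_item(Item=item)

print(f'Imported {len(items)} items')

Keep in mind that convert_decimals wrote number attributes out as strings, so convert them back to Decimal before the put_item call if the target table should store them as numbers.
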
Conclusion: We exported the values of a DynamoDB table to an S3 bucket so they can be shared with other accounts for import.

💬 If you enjoyed reading this blog post and found it informative, please take a moment to share your thoughts by leaving a comment and a like 😀, and follow me on dev.to and LinkedIn.
