Chintan Soni

Streamlining Database Migrations: From Basic to Ultra Pro Max Approach

Introduction

In the ever-evolving landscape of database management, an effective migration strategy can significantly impact project timelines and productivity. When I needed to migrate a substantial amount of data from DynamoDB to an RDS PostgreSQL database through AWS AppSync, I embarked on a journey to optimize the process. Through trial and error, I discovered methods that not only saved time but also preserved data integrity.

The Challenge

Tasked with migrating 4.8 lakh (480K) records from DynamoDB to RDS PostgreSQL via AppSync, I started with the obvious approach: fetching records recursively in batches of 1,000 and iterating over each batch to insert the rows into PostgreSQL one by one. This proved painfully slow, taking approximately 1.5 hours per 15K records, which works out to around 48 hours in total.
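
In code, the starting point looked roughly like the sketch below. The listRecords query, the records table, and the Item shape are placeholders I've assumed for illustration; the post's actual schema isn't shown. It assumes Node 18+ (global fetch) and the pg package.

```typescript
import { Pool } from "pg";

// Hypothetical record shape and page shape; the real schema is an assumption.
interface Item {
  id: string;
  payload: string;
}

interface Page {
  items: Item[];
  nextToken?: string;
}

const pool = new Pool({ connectionString: process.env.PG_URL });

// Fetch one page of up to 1,000 records from the AppSync GraphQL endpoint.
async function fetchPage(nextToken?: string): Promise<Page> {
  const res = await fetch(process.env.APPSYNC_URL!, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": process.env.APPSYNC_KEY!,
    },
    body: JSON.stringify({
      query: `query List($limit: Int, $nextToken: String) {
        listRecords(limit: $limit, nextToken: $nextToken) {
          items { id payload }
          nextToken
        }
      }`,
      variables: { limit: 1000, nextToken },
    }),
  });
  const json = await res.json();
  return json.data.listRecords;
}

// One INSERT per row: a full round trip per record, which is why
// this version needed roughly 48 hours end to end.
async function migrateOneByOne(): Promise<void> {
  let nextToken: string | undefined;
  do {
    const page = await fetchPage(nextToken);
    for (const item of page.items) {
      await pool.query("INSERT INTO records (id, payload) VALUES ($1, $2)", [
        item.id,
        item.payload,
      ]);
    }
    nextToken = page.nextToken;
  } while (nextToken);
}
```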

The Evolution

1. Batched Insert Operation
To improve efficiency, I introduced batching with a size of 1,000 records. By collecting the promises of the individual insert operations into an array and executing them concurrently with await Promise.all(), the time required dropped to 36 hours. Though an improvement, some risk of data loss remained, albeit a low one: any row that hit a constraint violation would still fail its individual insert and be lost unless retried.
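
A minimal sketch of that batching, reusing the same assumed Item shape, pool, and records table from the previous snippet:

```typescript
import { Pool } from "pg";

// Same assumed placeholders as before.
interface Item {
  id: string;
  payload: string;
}

const pool = new Pool({ connectionString: process.env.PG_URL });

async function insertBatch(items: Item[]): Promise<void> {
  // Dispatch all 1,000 single-row inserts at once; the pg pool bounds
  // actual concurrency by its connection limit.
  const inserts = items.map((item) =>
    pool.query("INSERT INTO records (id, payload) VALUES ($1, $2)", [
      item.id,
      item.payload,
    ])
  );
  // A single constraint violation rejects Promise.all, but each insert is
  // its own statement, so the other rows still land.
  await Promise.all(inserts);
}
```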

2. Bulk Insert Statement
The next level involved rewriting the insert statement to support bulk insert. This approach used a single statement over one database connection to insert all 1,000 records at once, cutting the time down to 18 hours. The drawback: if a single record failed insertion, the entire batch failed with it, posing a higher risk of data loss.
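
Here is one way such a multi-row INSERT can be built with numbered parameter placeholders, so each 1,000-record page becomes one statement and one round trip (a sketch; the records table and its columns are still assumptions):

```typescript
import { Pool } from "pg";

// Same assumed placeholders as before.
interface Item {
  id: string;
  payload: string;
}

const pool = new Pool({ connectionString: process.env.PG_URL });

async function bulkInsert(items: Item[]): Promise<void> {
  const placeholders: string[] = [];
  const params: string[] = [];
  items.forEach((item, i) => {
    // Produces ($1, $2), ($3, $4), ... for each row.
    placeholders.push(`($${2 * i + 1}, $${2 * i + 2})`);
    params.push(item.id, item.payload);
  });
  // One statement inserts the whole page; because a statement is atomic,
  // a single bad row fails all 1,000 rows with it.
  await pool.query(
    `INSERT INTO records (id, payload) VALUES ${placeholders.join(", ")}`,
    params
  );
}
```

If duplicate keys are the main worry, appending ON CONFLICT DO NOTHING would let the clean rows land while conflicting ones are dropped, though other constraint violations would still fail the whole batch.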

3. Ultra Pro Max Approach
Seeking the optimal solution, I implemented a different strategy. Making paginated calls to AppSync (1,000 records per page), I set up a local PostgreSQL environment and synchronized all 4.8 lakh records into it in just 1.5 hours. I then exported the table to a CSV file using pgAdmin (warning: massive file size!) and imported that file into the remote PostgreSQL instance. Surprisingly, the entire process, including data insertion, took less than 2 hours, showcasing the power of the database's import/export features.
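
I did the export and import by hand in pgAdmin, but for readers who prefer scripting it, the same CSV hop can be driven from Node over PostgreSQL's COPY protocol. The sketch below assumes the pg-copy-streams package and the same hypothetical records table:

```typescript
import fs from "node:fs";
import { pipeline } from "node:stream/promises";
import { Pool } from "pg";
import { from as copyFrom, to as copyTo } from "pg-copy-streams";

// Stream the local table out to a CSV file.
async function exportCsv(localPool: Pool, file: string): Promise<void> {
  const client = await localPool.connect();
  try {
    const source = client.query(
      copyTo("COPY records TO STDOUT WITH (FORMAT csv, HEADER)")
    );
    await pipeline(source, fs.createWriteStream(file));
  } finally {
    client.release();
  }
}

// Stream the CSV file into the remote table in one continuous COPY.
async function importCsv(remotePool: Pool, file: string): Promise<void> {
  const client = await remotePool.connect();
  try {
    const sink = client.query(
      copyFrom("COPY records FROM STDIN WITH (FORMAT csv, HEADER)")
    );
    await pipeline(fs.createReadStream(file), sink);
  } finally {
    client.release();
  }
}
```

COPY bypasses per-statement overhead entirely, which is why this path finishes in a fraction of the time the INSERT-based approaches took.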

Conclusion

The moral of this migration tale is clear: leveraging the import/export features of databases can drastically reduce migration times and mitigate the risk of data loss. The journey from a basic iterative approach to an Ultra Pro Max strategy highlights the importance of exploring various techniques to find the most efficient solution for your specific migration needs.
