Inserting test data into DynamoDB with NodeJS and TypeScript


In this blog post, we’ll guide you through inserting test data into a local DynamoDB instance using Node.js and TypeScript. We’ll cover the process, including setting up a local DynamoDB instance (a prerequisite), and then demonstrate how to load your CSV data using Node.js and TypeScript code.

DynamoDB, a fully managed NoSQL database service offered by AWS, boasts exceptional performance, scalability, and durability. This makes it a popular choice for storing vast quantities of data that require speedy access. One convenient method for populating DynamoDB with test data is through a CSV file. This approach is particularly beneficial when dealing with a significant amount of data.

Prerequisites to inserting test data into DynamoDB

Before we begin, ensure you have the following:

  • A Node.js development environment with npm (or yarn) installed.
  • The aws-sdk library installed (npm install aws-sdk); this is the v2 SDK, which the code below uses.
  • A CSV file containing your test data.

Note: This guide focuses on interacting with a local DynamoDB instance. For production environments, you’ll use the AWS endpoint URL instead.

Step 1: Setting Up DynamoDB Local (Optional)

If you haven’t already, download and run DynamoDB Local by following the official AWS documentation: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html
This gives you a local DynamoDB instance running on your machine, by default at http://localhost:8000.
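If you have Docker installed, a quick alternative to the downloadable JAR described in the docs (an assumption here, not the article’s setup) is to run the official DynamoDB Local image, which listens on port 8000 by default:

docker run -p 8000:8000 amazon/dynamodb-local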

Step 2: Create a DynamoDB Table

There are two main approaches to create your DynamoDB table:

  1. Using the AWS CLI: You can use the AWS CLI to create the table directly. Refer to the AWS documentation for specific commands: https://docs.aws.amazon.com/cli/latest/reference/dynamodb/create-table.html
  2. Programmatically: You can create the table using Node.js and the aws-sdk library; a minimal sketch is shown just below.
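For reference, here is a minimal sketch of the programmatic approach against DynamoDB Local. The table name (your-table-name) and the single string partition key (id) are placeholders; adjust them to match your own data.

import * as AWS from 'aws-sdk';

// Point the client at DynamoDB Local; remove the endpoint option for the AWS-hosted service
const ddb = new AWS.DynamoDB({
  apiVersion: '2012-08-10',
  region: 'us-east-1',        // any region works against DynamoDB Local
  endpoint: 'http://localhost:8000',
  accessKeyId: 'local',       // dummy credentials are fine for DynamoDB Local
  secretAccessKey: 'local',
});

async function createTable() {
  // Placeholder schema: a single string partition key named 'id'
  await ddb
    .createTable({
      TableName: 'your-table-name',
      AttributeDefinitions: [{ AttributeName: 'id', AttributeType: 'S' }],
      KeySchema: [{ AttributeName: 'id', KeyType: 'HASH' }],
      BillingMode: 'PAY_PER_REQUEST',
    })
    .promise();

  console.log('Table created');
}

createTable().catch(console.error);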

Step 3: Create the Node.js Project and Install Dependencies

Create a new directory for your project and initialize a package.json file:

mkdir dynamo-data-loader
cd dynamo-data-loader
npm init -y

Next, install the required dependency:

npm install aws-sdk
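Since the loader is written in TypeScript, you’ll also want the TypeScript tooling as dev dependencies (this is one convenient setup, not the only one); ts-node lets you run the finished script directly with npx ts-node data-loader.ts:

npm install --save-dev typescript ts-node @types/node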

Step 4: Code Implementation (Node.js and TypeScript)

Create a file named data-loader.ts and add the following code:

import * as AWS from 'aws-sdk';
import * as fs from 'fs'; // Required for file system access

// Configure the DynamoDB client. For DynamoDB Local, point the endpoint at http://localhost:8000;
// remove the endpoint option (and use real credentials) when targeting the AWS-hosted service.
const ddb = new AWS.DynamoDB({
  apiVersion: '2012-08-10',
  region: 'us-east-1',        // any region works against DynamoDB Local
  endpoint: 'http://localhost:8000',
  accessKeyId: 'local',       // dummy credentials are fine for DynamoDB Local
  secretAccessKey: 'local',
});

// Replace with your table name and CSV file path
const tableName = 'your-table-name';
const csvFilePath = './data/user.csv';

async function loadDataFromCSV() {
  try {
    const data = await parseCSV(csvFilePath);

    for (const item of data) {
      await putItemInDynamoDB(tableName, item);
    }

    console.log('Data loaded successfully!');
  } catch (error) {
    console.error('Error loading data:', error);
  }
}

// Function to parse CSV data (implement your logic here)
function parseCSV(filePath: string): Promise<any[]> {
  // Replace with your CSV parsing logic using a library like 'csv-parser'
  return new Promise((resolve, reject) => {
    // ... your CSV parsing implementation
  });
}

async function putItemInDynamoDB(tableName: string, item: any) {
  const params = {
    TableName: tableName,
    // The low-level client expects DynamoDB attribute values, so convert the plain CSV row first
    Item: AWS.DynamoDB.Converter.marshall(item),
  };

  await ddb.putItem(params).promise();
}

loadDataFromCSV();

Explanation:

  • We import the AWS object from the aws-sdk library.
  • Configure the ddb client with the DynamoDB endpoint URL. For DynamoDB Local this is http://localhost:8000; remove the endpoint option to use the AWS-hosted service.
  • Define the tableName and csvFilePath variables (replace with your values).

To parse the CSV data, we’ll use the csv-parser library. Install it:

npm install csv-parser

Then, implement the parseCSV function as follows:

import * as csvParser from 'csv-parser'; // add this import at the top of data-loader.ts

function parseCSV(filePath: string): Promise<any[]> {
  return new Promise((resolve, reject) => {
    const results: any[] = [];

    fs.createReadStream(filePath)
      .pipe(csvParser())
      .on('data', (data) => results.push(data))
      .on('end', () => {
        resolve(results);
      })
      .on('error', (error) => {
        reject(error);
      });
  });
}

This function reads the CSV file, parses each line, and returns an array of objects, where each object represents a row from the CSV.
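For example, given a hypothetical user.csv like this (the column names are placeholders, not ones assumed by the article):

id,name,age
1,Alice,30
2,Bob,25

csv-parser treats the first line as the header row and yields every value as a string, so parseCSV resolves to something like:

[
  { id: '1', name: 'Alice', age: '30' },
  { id: '2', name: 'Bob', age: '25' }
]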

Inserting test data into DynamoDB

The putItemInDynamoDB function is already set up to insert items into DynamoDB. It takes the table name and an item object as input, converts the plain object to DynamoDB’s attribute-value format with AWS.DynamoDB.Converter.marshall, and then uses the ddb.putItem method to insert the item.

Remember: Ensure that the data structure in your CSV file matches the schema of your DynamoDB table. You might need to convert data types or handle missing values as necessary.
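A minimal sketch of that kind of clean-up, assuming the hypothetical id/name/age columns shown earlier and a table whose partition key is the string id:

// Hypothetical mapping from raw CSV strings to the types the table expects
function toTableItem(row: any) {
  const item: any = {
    id: row.id,                  // keep the partition key as a string
    name: row.name || 'unknown', // fall back when the field is empty
  };

  // Convert numeric columns from string to number, and omit them entirely when missing
  if (row.age) {
    item.age = Number(row.age);
  }

  return item;
}

You would then call putItemInDynamoDB(tableName, toTableItem(item)) inside the loader loop instead of passing the raw row.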

Error Handling and Best Practices

  • Error Handling: Implement robust error handling to catch and handle exceptions gracefully.
  • Batch Writing: For improved performance, consider using batchWriteItem to insert multiple items in a single API call (see the sketch after this list).
  • Data Validation: Validate the data before inserting it into DynamoDB to prevent errors.
  • Rate Limiting: Be aware of DynamoDB’s write capacity limits and adjust your code accordingly to avoid throttling.
  • Backoff and Retry: Implement retry logic for failed operations, with exponential backoff to avoid overwhelming the system.
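Putting the batch-writing and backoff points together, here is a minimal sketch (one possible shape, not the article’s code) that reuses the ddb client from earlier, writes plain-object items in chunks of 25, and retries unprocessed items with exponential backoff:

async function batchWrite(tableName: string, items: any[]) {
  // DynamoDB accepts at most 25 items per batchWriteItem call
  for (let i = 0; i < items.length; i += 25) {
    const chunk = items.slice(i, i + 25);

    let request: AWS.DynamoDB.BatchWriteItemInput = {
      RequestItems: {
        [tableName]: chunk.map((item) => ({
          PutRequest: { Item: AWS.DynamoDB.Converter.marshall(item) },
        })),
      },
    };

    // Retry any unprocessed items with a simple exponential backoff (give up after 5 attempts)
    for (let attempt = 1; attempt <= 5; attempt++) {
      const result = await ddb.batchWriteItem(request).promise();
      const unprocessed = result.UnprocessedItems || {};
      if (Object.keys(unprocessed).length === 0) break;

      await new Promise((resolve) => setTimeout(resolve, 100 * Math.pow(2, attempt)));
      request = { RequestItems: unprocessed };
    }
  }
}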

Conclusion on inserting test data into DynamoDB with NodeJS

By following these guidelines and customizing the code to your specific needs, you can effectively load test data into your DynamoDB tables using Node.js and TypeScript. Once that’s done, it’s time to create a backend (BE) Node.js application with Express.

That’s all.
Try it at home!
