
End-to-End Testing on S3 and DynamoDB
Introduction
This blog describes End-to-End (E2E) testing of the scenario below.

When the operator uploads a JSON file to an S3 bucket, a Lambda function is triggered that reads the predefined contents from the JSON file and writes them to a DynamoDB table.
Pre-conditions
- Playwright: 1.27.1
- The minimum AWS SDK settings (e.g. an IAM user with read-write access to S3, DynamoDB, etc.)
- An Apollo Client setup for querying DynamoDB.
Implementation
The E2E tests are implemented using the Playwright framework.
S3 Instance Generation
A new S3 instance is created, and connection information can be specified. Passing an empty object ({}) uses the settings from the current environment's default profile.
Here is reference code for creating an S3 instance, together with a refreshObjects function that clears all objects in the bucket.
e2e/src/util/s3-util.ts
import { S3 } from '@aws-sdk/client-s3';
export const testBucketName = 'test-bucket';
export const testJsonFileName = 'testFile.json';
export const buildS3Clients = (): S3 => {
  const s3 = new S3({ region: 'ap-northeast-1' });
  return s3;
};
export const refreshObjects = async ({
  s3,
  bucketName,
}: {
  s3: S3;
  bucketName: string;
}): Promise<void> => {
  // List objects in the bucket passed as a parameter (the original
  // mistakenly referenced the testBucketName constant here).
  const { Contents: contents = [] } = await s3.listObjectsV2({
    Bucket: bucketName,
  });
  // eslint-disable-next-line no-restricted-syntax
  for (const content of contents) {
    // eslint-disable-next-line no-await-in-loop
    await s3.deleteObject({
      Bucket: bucketName,
      Key: content.Key!,
    });
  }
};
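The article targets real AWS via the default profile, but the S3 constructor also accepts explicit connection information. As a sketch, here is a hypothetical configuration for a local S3 emulator such as LocalStack; the endpoint URL, port, and dummy credentials are assumptions, not values from the original setup.

```typescript
// Hypothetical connection settings for a local S3 emulator (e.g. LocalStack).
// None of these values appear in the original article; adjust to your setup.
export const localS3Config = {
  region: 'ap-northeast-1',
  endpoint: 'http://localhost:4566', // LocalStack's default edge port (assumption)
  forcePathStyle: true, // local emulators usually require path-style URLs
  credentials: { accessKeyId: 'test', secretAccessKey: 'test' }, // dummy values
};

// Usage sketch: const s3 = new S3(localS3Config);
```

Switching between this and the empty-object form lets the same test suite run against either a local emulator or a real AWS account.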
E2E Test on S3 and DynamoDB
In this test case we execute the s3.putObject command, which uploads the object to the specified bucket. This in turn triggers the Lambda function, and as a result we expect the data in the JSON file to be inserted into the DynamoDB table. This is verified by querying the table.
e2e/src/api-test/data-registration.test.ts
import { expect, test } from '@playwright/test';
import { gql } from '@apollo/client';
import {
  buildS3Clients,
  testBucketName,
  testJsonFileName,
  refreshObjects,
} from '../util/s3-util';
// buildApolloClient and idToken come from the project's Apollo Client
// setup (see Pre-conditions); the import path here is illustrative.
import { buildApolloClient, idToken } from '../util/apollo-util';
const s3 = buildS3Clients();
const TestJsonData = [
  {
    id: '12345',
    name: 'Natasha',
    age: 10,
    country: 'America',
  },
];
test.beforeEach(async () => {
  await refreshObjects({ s3, bucketName: testBucketName });
});
test('data register test', async () => {
  await s3.putObject({
    Bucket: testBucketName,
    Key: testJsonFileName,
    Body: JSON.stringify(TestJsonData),
  });
  const { Contents: contents = [] } = await s3.listObjectsV2({
    Bucket: testBucketName,
  });
  expect(contents[0].Key).toBe(testJsonFileName);
  //querying using ApolloClient
  const client = buildApolloClient({ idToken });
  const result = await client.query({
    query: gql`
      query MyQuery {
        Test {
          id
          name
          age
          country
        }
      }
    `,
  });
  expect(result.data.Test[0]).toMatchObject({
    id: '12345',
    name: 'Natasha',
    age: 10,
    country: 'America',
  });
});
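One practical wrinkle: the S3-triggered Lambda runs asynchronously, so the DynamoDB record may not be visible immediately after putObject returns. A minimal polling helper, sketched below, can retry a check until it passes; the function name, retry count, and interval are assumptions and not part of the original article.

```typescript
// Polls an async operation until a check on its result passes, or throws
// after the given number of retries. A sketch; tune retries/intervalMs
// to your Lambda's typical latency.
export const waitFor = async <T>(
  fn: () => Promise<T>,
  check: (value: T) => boolean,
  { retries = 10, intervalMs = 1000 } = {},
): Promise<T> => {
  for (let i = 0; i < retries; i += 1) {
    const value = await fn();
    if (check(value)) return value;
    // Wait before the next attempt.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`condition not met after ${retries} attempts`);
};
```

The Apollo query in the test above could then be wrapped, e.g. `await waitFor(() => client.query({ query }), (r) => r.data.Test.length > 0)`, instead of asserting on a single immediate query.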
Happy Learning!













