A production-ready AWS Lambda template using TypeScript with ECMAScript Modules (ESM), featuring S3-to-PostgreSQL CSV processing, comprehensive testing, and automated code quality workflows.
- TypeScript with ESM: Native ES modules using the `.mts` extension and `NodeNext` module resolution
- AWS SDK v3: Modern AWS integrations with S3 and SQS event handling
- PostgreSQL Integration: Batch CSV ingestion from S3 to PostgreSQL
- Production-Ready: Error handling, retries, configurable batch processing
- Code Quality: Biome for linting/formatting, Lefthook for git hooks, automated CI/CD
- Type Safety: Full TypeScript types with AWS Lambda event definitions
- Testing: Vitest with native ESM support and AWS SDK mocking
- Node.js >= 20 (Lambda runtime: `nodejs20.x` or `nodejs22.x`)
- pnpm >= 10 (fast, disk-efficient package manager)
- PostgreSQL (for local development and testing)
- AWS Account (for deployment)
```bash
pnpm install
```

Compile TypeScript to JavaScript in the `dist/` directory:

```bash
pnpm run build
```

Run the test suite:

```bash
pnpm test

# Watch mode for development
pnpm run test:watch
```

```bash
# Check code with Biome
pnpm run lint

# Auto-fix issues
pnpm run lint:fix

# Format code
pnpm run format
```

```
typescript-lambda-example/
├── src/
│   ├── handler.mts          # Main Lambda handler for SQS/S3 events
│   └── s3-to-mssql.mts      # S3 CSV to PostgreSQL pipeline
├── tests/
│   └── handler.test.mts     # Vitest tests with AWS SDK mocks
├── .github/workflows/       # CI/CD automation
├── dist/                    # Compiled JavaScript output
├── tsconfig.json            # TypeScript ESM configuration
├── vitest.config.mts        # Vitest configuration
├── biome.json               # Biome linter/formatter configuration
└── package.json             # Dependencies and scripts
```
The main handler (`src/handler.mts`) processes SQS events where each message body contains an S3 event notification. It:
- Parses SQS messages and extracts S3 event details
- Fetches CSV files from S3
- Streams and parses CSV data
- Batch inserts rows into PostgreSQL
- Handles errors with configurable retry logic
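The first two steps above (unwrapping each SQS record and pulling out the S3 object details) can be sketched roughly as follows. The types here are local stand-ins for the `aws-lambda` definitions, and the `extractObjects` helper is illustrative rather than the template's actual code:

```typescript
// Local stand-in for the shape of an S3 event notification carried
// in an SQS record's body (see the aws-lambda package for full types).
interface S3Notification {
  Records?: Array<{
    s3: { bucket: { name: string }; object: { key: string } };
  }>;
}

// Hypothetical helper: parse one SQS record body and return the
// (bucket, key) pairs it references.
export function extractObjects(
  body: string,
): Array<{ bucket: string; key: string }> {
  const event = JSON.parse(body) as S3Notification;
  return (event.Records ?? []).map((record) => ({
    bucket: record.s3.bucket.name,
    // S3 keys arrive URL-encoded, with spaces encoded as '+'
    key: decodeURIComponent(record.s3.object.key.replace(/\+/g, " ")),
  }));
}
```

Each returned `(bucket, key)` pair would then drive a `GetObjectCommand` fetch in the following step.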
Configure these in your Lambda function:
- `PGHOST` — PostgreSQL host
- `PGPORT` — PostgreSQL port (default: `5432`)
- `PGDATABASE` — Database name
- `PGUSER` — Database user
- `PGPASSWORD` — Database password
- `TABLE_NAME` — Target table (default: `csv_raw_rows`)
- `BATCH_SIZE` — Insert batch size (default: `100`)
- `THROW_ON_ERROR` — Throw on DB errors so SQS retries the message (default: `true`)
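Reading this configuration with the documented defaults could look like the sketch below. The `loadConfig` helper is hypothetical, but the variable names and defaults match the list above:

```typescript
// Illustrative sketch: load the Lambda's configuration from environment
// variables, applying the defaults documented in this README.
export interface DbConfig {
  host: string | undefined;
  port: number;
  database: string | undefined;
  user: string | undefined;
  tableName: string;
  batchSize: number;
  throwOnError: boolean;
}

export function loadConfig(
  env: Record<string, string | undefined>, // pass process.env in the handler
): DbConfig {
  return {
    host: env.PGHOST,
    port: Number(env.PGPORT ?? 5432),
    database: env.PGDATABASE,
    user: env.PGUSER,
    tableName: env.TABLE_NAME ?? "csv_raw_rows",
    batchSize: Number(env.BATCH_SIZE ?? 100),
    // Any value other than the literal string "false" keeps retries on
    throwOnError: (env.THROW_ON_ERROR ?? "true") !== "false",
  };
}
```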
Example S3 event notification carried in each SQS message body:

```json
{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "s3": {
        "bucket": { "name": "my-bucket" },
        "object": { "key": "data/file.csv" }
      }
    }
  ]
}
```

This project uses ECMAScript Modules with TypeScript:

- Source files use the `.mts` extension
- `package.json` sets `"type": "module"`
- `tsconfig.json` uses `"module": "NodeNext"` and `"moduleResolution": "NodeNext"`
- Local imports require explicit `.mjs` extensions
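A minimal `tsconfig.json` matching these settings might look like the fragment below. Only `module` and `moduleResolution` are confirmed by this README; the remaining fields are illustrative:

```json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "target": "ES2022",
    "outDir": "dist",
    "strict": true
  },
  "include": ["src/**/*.mts"]
}
```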
```typescript
import type { SQSEvent } from "aws-lambda";
import { S3Client } from "@aws-sdk/client-s3";
import { myFunction } from "./utils.mjs"; // Note the .mjs extension
```

Git hooks are automatically installed when you run `pnpm install`. They will:
Pre-commit:
- Run Biome linter and formatter on staged files
- Lint markdown files
- Remove trailing whitespace
Pre-push:
- Run tests
- Build the project
Manual hook execution:
```bash
# Run pre-commit checks
npx lefthook run pre-commit
# Run pre-push checks
npx lefthook run pre-push
```
## Deployment
### Package for Lambda
Create a deployment package with compiled code and dependencies:
```bash
pnpm run build
zip -r lambda-package.zip dist node_modules package.json
```

Windows PowerShell:

```powershell
pnpm run build
Compress-Archive -Path dist, node_modules, package.json `
  -DestinationPath lambda-package.zip
```

### Lambda Configuration

- Runtime: `nodejs20.x`
- Handler: `dist/handler.handler`
- Architecture: `x86_64` or `arm64`
- Memory: 512 MB (adjust based on CSV size)
- Timeout: 300 seconds (adjust for large files)
- Environment: Set the PostgreSQL and processing variables
```hcl
resource "aws_lambda_function" "csv_processor" {
  filename      = "lambda-package.zip"
  function_name = "s3-csv-to-postgres"
  role          = aws_iam_role.lambda_role.arn
  handler       = "dist/handler.handler"
  runtime       = "nodejs20.x"
  timeout       = 300
  memory_size   = 512

  environment {
    variables = {
      PGHOST     = var.db_host
      PGDATABASE = var.db_name
      TABLE_NAME = "csv_data"
      BATCH_SIZE = "100"
    }
  }
}
```

GitHub Actions workflows automate quality checks:
- CI: Build, test, lint on every push/PR
- Code Quality: Biome checks and TypeScript type checking
- Security: CodeQL analysis, dependency audits
- Semantic Release: Automated versioning and changelogs
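A CI workflow along these lines might look like the following sketch; the file layout, action versions, and step names are illustrative, not the template's actual workflow:

```yaml
# .github/workflows/ci.yml (illustrative)
name: CI
on: [push, pull_request]

jobs:
  ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: pnpm
      - run: pnpm install --frozen-lockfile
      - run: pnpm run lint
      - run: pnpm test
      - run: pnpm run build
```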
- Create a feature branch
- Make changes with semantic commit messages (e.g., `feat:`, `fix:`)
- Ensure tests pass: `pnpm test`
- Ensure linting passes: `pnpm run lint`
- Submit a pull request
See LICENSE for details.