A contact management system built to upload and process CSV files with over 1 million rows.
- System should be scalable (ideally serverless)
- Users should be able to upload CSV files with 1 million rows
- Users should be able to see the progress of the file upload in real time
- Next.js - Frontend & backend
- Drizzle - ORM
- tRPC - RPC wrapper around Tanstack Query
- Vercel - Next.js hosting & Fluid Compute
- Resend - Email service
- React Email - Email templates
- new.email - React email generator
- Upstash Qstash - Queue service
- Pusher - Realtime updates with websockets
- Cloudflare R2 - Object storage
- AWS Aurora Serverless Postgres - Database
- The user uploads a CSV file using a presigned URL.
- The frontend notifies the backend when the file is uploaded and ready to be processed.
- The backend streams in the file from R2.
- As the file is streamed in, it is broken down into chunks.
- We create a record in the database for each chunk to track its progress, and add the chunk to a queue.
- The queue pushes each chunk to a serverless function, which parses it and saves its data to the database.
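The chunking step above can be sketched as a small pure helper. This is an illustrative sketch only: the `Row` type, `chunkRows` name, and chunk size are assumptions, not the project's actual code.

```typescript
// Split parsed CSV rows into fixed-size chunks so that each queued job
// stays small enough to be processed in a single serverless invocation.
type Row = Record<string, string>;

function chunkRows(rows: Row[], chunkSize = 1000): Row[][] {
  const chunks: Row[][] = [];
  for (let i = 0; i < rows.length; i += chunkSize) {
    chunks.push(rows.slice(i, i + chunkSize));
  }
  return chunks;
}

// Example: 2,500 rows with a chunk size of 1,000 yields 3 chunks.
const rows = Array.from({ length: 2500 }, (_, i) => ({
  email: `user${i}@example.com`,
}));
const chunks = chunkRows(rows, 1000);
console.log(chunks.length); // 3
console.log(chunks[2].length); // 500
```

In the real pipeline each chunk would get a tracking row in the database before being published to QStash, so progress can be reported as `processedChunks / totalChunks`.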
Install the dependencies:

```bash
npm install
```
Note: Make sure you have all the environment variables set up in your `.env` file. Without them, the app won't run.
We use `concurrently` to run Next.js and the queue server with one command:

```bash
npm run dev
```
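For reference, a typical `concurrently` wiring in `package.json` might look like the following. This is illustrative only; the exact script contents are an assumption, so check the repo's actual `package.json`:

```json
{
  "scripts": {
    "dev": "concurrently \"next dev\" \"npx @upstash/qstash-cli dev\"",
    "db:push": "drizzle-kit push"
  }
}
```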
If you need to run the queue server separately, you can use the following command:
```bash
npx @upstash/qstash-cli dev
```
You can use Docker to run the database locally:

```bash
docker pull postgres
docker run -d \
  --name dex \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_USER=postgres \
  -e POSTGRES_DB=dex \
  -e POSTGRES_HOST_AUTH_METHOD=scram-sha-256 \
  -e POSTGRES_INITDB_ARGS="--auth-host=scram-sha-256" \
  -p 5432:5432 \
  postgres
```
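Assuming the app reads its database connection string from a `DATABASE_URL` environment variable (an assumption; check your `.env` setup), the container above is reachable at:

```bash
# Matches the credentials, port, and database name from the docker run flags above.
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/dex"
```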
After updating the schema, we can push the changes to the database with the following command:
```bash
npm run db:push
```
Since we are using Vercel, pushing to the `main` branch automatically deploys to production.