LinkedUp is a LinkedIn-style social networking application built to explore real-world backend and full-stack concepts such as users, connections, posts, messaging, and conversations.
This repository includes the database schema backup required to run the project locally.
- Backend: Node.js / Express
- Database: PostgreSQL
- Frontend: React / Next.js
- Auth: Sessions / JWT
- Microservice: Java Spring Boot
- Caching: Redis
- Backend (https://github.com/themechbro/linkedup-backend)
- Microservice (https://github.com/themechbro/linkedup_microservice)
From 2 Feb 2026 onwards, Redis is used in LinkedUp as a high-speed, in-memory layer to reduce PostgreSQL load, accelerate feed reads, and power short-lived realtime signals without writing transient state to the database.
Why Redis here?
- Feed generation is read-heavy and expensive to recompute (connections → posts → enrichment → pagination)
- Profile and connection data are reused across many APIs
- Realtime UX (e.g., typing indicators) should not touch the DB
- Like counts are frequently read, rarely changed
What we cache
| Use Case | Key Pattern | TTL | Purpose |
|---|---|---|---|
| Feed cache | `feed:connections:{user_id}` | 60–120 sec | Cache fully enriched paginated feed (fan-out-on-read optimization) |
| User profile | `user:profile:{user_id}` | 5–10 min | Avoid repeated profile lookups during feed/comment rendering |
| User connections | `user:connections:{user_id}` | 5 min | Reused for feed building and visibility checks |
| Typing indicator | `typing:{conversation_id}:{user_id}` | 5 sec | Realtime "user is typing" without DB writes |
| Post like count | `post:likes:{post_id}` | 2–5 min | Reduce repeated count reads from likes service/DB |
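As a concrete illustration, here is a minimal cache-aside sketch using the node-redis v4 client. `buildFeedFromDb` is a hypothetical helper standing in for the expensive connections → posts → enrichment → pagination pipeline, and appending page/limit to the key is an assumption on top of the `feed:connections:{user_id}` pattern above:

```js
const { createClient } = require("redis");

const redis = createClient({ url: "redis://localhost:6379" });
redis.connect().catch(console.error);

// Cache-aside read for the connections feed.
async function getFeed(userId, page, limit) {
  const key = `feed:connections:${userId}:${page}:${limit}`;

  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached); // hit: zero DB/microservice calls

  const feed = await buildFeedFromDb(userId, page, limit); // miss: run the full pipeline
  await redis.set(key, JSON.stringify(feed), { EX: 90 }); // within the 60–120 sec TTL band
  return feed;
}

// Typing indicator: a 5-second key means "user is typing" never touches PostgreSQL.
async function setTyping(conversationId, userId) {
  await redis.set(`typing:${conversationId}:${userId}`, "1", { EX: 5 });
}
```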
Performance Benchmarks
Before Redis (No Cache)
- First Request: ~500ms
- Subsequent Requests: ~500ms
- DB Queries per Request: 2–3
- Microservice Calls: 1 per request
After Redis (With Cache)
- First Request (Cache Miss): ~500ms
- Cached Requests (Cache Hit): ~20ms ⚡
- DB Queries per Request: 0 (when cached)
- Microservice Calls: 0 (when cached)
- Performance Improvement: ~25x faster
Cache Hit Rate Optimization
To maximize cache hit rate:
- Use consistent pagination: Encourage users to use standard limit values (10, 20, 50)
- Monitor cache stats: Track hit/miss ratios
- Adjust TTL: Balance freshness vs performance
- Pre-warm cache: Cache feeds for active users during off-peak hours
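A hedged sketch of the first two points, assuming the paginated key shape from the earlier example: snap arbitrary limits to the standard values so near-identical requests share one cache entry, and count hits/misses so the ratio can be tracked:

```js
const STANDARD_LIMITS = [10, 20, 50];
const stats = { hits: 0, misses: 0 };

// Snap an arbitrary limit to the nearest standard value so that
// limit=18 and limit=20 resolve to the same cache key.
function normalizedFeedKey(userId, page, limit) {
  const snapped = STANDARD_LIMITS.reduce((best, l) =>
    Math.abs(l - limit) < Math.abs(best - limit) ? l : best
  );
  return `feed:connections:${userId}:${page}:${snapped}`;
}

// Wrap reads to keep a running hit/miss ratio.
async function cachedGet(redis, key) {
  const value = await redis.get(key);
  value === null ? stats.misses++ : stats.hits++;
  return value;
}
```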
The project ships with a PostgreSQL SQL file that defines the entire database structure required for the application to run, including:
- Tables
- Columns and data types
- Primary keys & foreign keys
- Indexes
- Sequences
- Triggers and functions
⚠️ The file may also contain demo/sample data. This is intentional for development convenience.
Create the database:

```sql
CREATE DATABASE linkedup;
```

Load the schema using psql:

```bash
psql -U postgres -d linkedup -f linkedup_schema.sql
```

Or using pgAdmin:
1. Create an empty database named `linkedup`
2. Right-click the database → Restore
3. Select `linkedup_schema.sql`
4. Restore
Create a `.env` file in the backend root:

```env
DB_HOST=localhost
DB_PORT=5432
DB_NAME=linkedup
DB_USER=postgres
DB_PASSWORD=your_password
```

Then install dependencies and start the dev server:

```bash
npm install
npm run dev
```
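For reference, a minimal sketch of how the backend might read these variables, assuming the `pg` and `dotenv` packages (the actual linkedup-backend wiring may differ):

```js
require("dotenv").config();
const { Pool } = require("pg");

// Connection pool built from the .env values above.
const pool = new Pool({
  host: process.env.DB_HOST,
  port: Number(process.env.DB_PORT),
  database: process.env.DB_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
});

// Quick connectivity check.
pool.query("SELECT 1").then(() => console.log("PostgreSQL connected"));

module.exports = pool;
```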
Key tables include:
- `users` – user profiles and authentication data
- `posts` – user posts
- `comments` – post comments
- `connections` – accepted user connections
- `connection_requests` – pending connection requests
- `messages` – chat messages
- `conversations` – user conversations
- `education` – education details
- `jobs` – job / experience details
- `session` – session tracking
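To make the table relationships concrete, here is a hedged sketch of the kind of feed query the backend might run over `connections` and `posts`, reusing the pool from the sketch above. All column names (`requester_id`, `receiver_id`, `author_id`, `created_at`) are illustrative assumptions; the real definitions live in `linkedup_schema.sql`:

```js
// Hypothetical feed query: posts authored by the user's accepted connections.
async function fetchConnectionPosts(pool, userId, limit, offset) {
  const { rows } = await pool.query(
    `SELECT p.*
       FROM posts p
       JOIN connections c
         ON p.author_id = CASE
              WHEN c.requester_id = $1 THEN c.receiver_id
              ELSE c.requester_id
            END
      WHERE c.requester_id = $1 OR c.receiver_id = $1
      ORDER BY p.created_at DESC
      LIMIT $2 OFFSET $3`,
    [userId, limit, offset]
  );
  return rows;
}
```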
As of now, media files (images and videos) are stored locally. That means, after cloning the linkedup-backend repo (the Express server), you have to create a folder named `Uploads` containing two subfolders: `images` and `videos`. The structure is shown below 👇
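```text
linkedup-backend/
└── Uploads/
    ├── images/
    └── videos/
```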
The database schema is version-controlled via SQL, not migrations.
If you modify the schema, regenerate the SQL backup before committing.
For production setups, consider splitting the backup into:
- `schema.sql` – structure only
- `seed.sql` – demo/sample data
This repo holds the full specification, but it is not the final, merged repository. The backend and microservice have their own dedicated repos (linked above); clone those as well to run the project. Everything will be merged into a single repository only once the project is complete.
This project is for educational and learning purposes.
Inspired by real-world social networking platforms to practice scalable backend and database design.