A comprehensive search engine for data structures and algorithms, built with React, TypeScript, and Tailwind CSS.
- Search Engine: Find algorithms and data structures by name, category, or complexity
- Detailed Information: Access comprehensive details on time/space complexity, implementation, and use cases
- Examples: View practical examples of algorithm implementation and usage
- Responsive Design: Works seamlessly across desktop and mobile devices
- React 19: The latest version of React with improved performance
- TypeScript: For static type checking
- Tailwind CSS: For utility-first styling
- Vite: For incredibly fast development and build
- React Router: For client-side routing
- Google OAuth: For authentication via Google
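The React Router entry above handles client-side routing between pages. As a minimal, illustrative sketch (the page components and paths below are placeholders, not the project's actual routes), the router could be wired up like this:

```tsx
// src/main.tsx: illustrative sketch only; the real pages live under src/pages/.
import React from "react";
import ReactDOM from "react-dom/client";
import { createBrowserRouter, RouterProvider } from "react-router-dom";

// Placeholder pages so the sketch stays self-contained.
const Home = () => <h1>DSA Search Engine</h1>;
const AlgorithmDetail = () => <h1>Algorithm details</h1>;

// Map URL paths to route components (React Router v6.4+ data router API).
const router = createBrowserRouter([
  { path: "/", element: <Home /> },
  { path: "/algorithms/:slug", element: <AlgorithmDetail /> },
]);

ReactDOM.createRoot(document.getElementById("root")!).render(
  <React.StrictMode>
    <RouterProvider router={router} />
  </React.StrictMode>
);
```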
- Clone this repository
- Install dependencies with `npm install`
- Create a `.env.local` file by copying `.env.example`
- Add your Google OAuth credentials to `.env.local` (see GOOGLE_AUTH_SETUP.md for details)
- Run the development server with `npm run dev`
Note: The `.env.local` file contains sensitive API keys and is ignored by Git. For instructions on setting up environment variables on hosting platforms, see DEPLOYMENT_ENV_SETUP.md.
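Since the project is built with Vite, client code reads these values through `import.meta.env`, and only variables prefixed with `VITE_` are exposed to the browser. The variable name in the sketch below is an assumption; check `.env.example` for the names this project actually uses:

```ts
// src/services/googleAuth.ts: sketch of reading the OAuth client ID from Vite env vars.
// VITE_GOOGLE_CLIENT_ID is a hypothetical name; see .env.example for the real one.
export function getGoogleClientId(): string {
  const clientId = import.meta.env.VITE_GOOGLE_CLIENT_ID;
  if (!clientId) {
    throw new Error(
      "Missing VITE_GOOGLE_CLIENT_ID: copy .env.example to .env.local and fill it in"
    );
  }
  return clientId;
}
```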
```
src/
├── components/   # Reusable UI components
├── pages/        # Route components
├── services/     # API and service interactions
├── hooks/        # Custom React hooks
├── utils/        # Helper functions
└── types/        # TypeScript type definitions
```
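To make the layout concrete, here is a rough sketch of what an entry in `types/` and a search helper in `utils/` might look like; the field names are illustrative assumptions, not the project's actual definitions:

```ts
// src/types/algorithm.ts: hypothetical shape of an algorithm entry.
export interface AlgorithmEntry {
  name: string;            // e.g. "Merge Sort"
  category: string;        // e.g. "sorting", "graph"
  timeComplexity: string;  // e.g. "O(n log n)"
  spaceComplexity: string; // e.g. "O(1)"
  description: string;
}

// src/utils/search.ts: naive filter matching by name, category, or complexity.
export function searchAlgorithms(
  entries: AlgorithmEntry[],
  query: string
): AlgorithmEntry[] {
  const q = query.trim().toLowerCase();
  if (!q) return entries;
  return entries.filter((entry) =>
    [entry.name, entry.category, entry.timeComplexity, entry.spaceComplexity].some(
      (field) => field.toLowerCase().includes(q)
    )
  );
}
```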
- Clone the repository
  ```bash
  git clone https://github.com/yourusername/dsa-search-engine.git
  cd dsa-search-engine
  ```
- Install dependencies
  ```bash
  npm install
  ```
- Start the development server
  ```bash
  npm run dev
  ```
- Build for production
  ```bash
  npm run build
  ```
- Deploy to GitHub Pages
  ```bash
  npm run deploy
  ```
Or simply push to the main branch and GitHub Actions will deploy automatically.
The website includes problems from LeetCode that are loaded from a static JSON file. This file is created by a Python scraper. There are multiple ways to update this data:
Run the scraper script manually when you want to update:
```bash
cd backend
python leetcode_scraper.py
```

The repository includes a GitHub Action that automatically updates the LeetCode problems once a month (on the 1st day of each month at midnight UTC). This ensures users see up-to-date problems without manual intervention.
You can also trigger manual updates from the GitHub Actions tab in your repository.
- GitHub Action runs automatically on the 1st day of each month
- Problems are scraped from the LeetCode API
- Changes are committed automatically if new problems are found
- The Vercel deployment updates with the new data
This means the website will stay up to date without you needing to manually run the scraper.
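On the frontend side, consuming this static JSON could look roughly like the sketch below; the file path and field names are assumptions, since the scraper's exact output format isn't documented here:

```ts
// Hypothetical loader for the scraped LeetCode problems; adjust the path and
// fields to match the JSON the Python scraper actually writes.
export interface LeetCodeProblem {
  id: number;
  title: string;
  difficulty: "Easy" | "Medium" | "Hard";
  url: string;
}

export async function loadLeetCodeProblems(): Promise<LeetCodeProblem[]> {
  const res = await fetch("/data/leetcode_problems.json"); // assumed location
  if (!res.ok) {
    throw new Error(`Failed to load problems: ${res.status}`);
  }
  return (await res.json()) as LeetCodeProblem[];
}
```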
- `npm run dev` - Start the development server
- `npm run build` - Build for production
- `npm run lint` - Run ESLint to check code quality
- `npm run preview` - Preview the production build locally
Currently, the app uses mock data. In a production environment, this would be replaced with:
- A backend API serving algorithm data
- A database containing comprehensive algorithm information
- User-contributed examples and implementations
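One way to stage that replacement is to hide the data source behind a small interface so the UI does not care whether data comes from mocks or an API. This is only a sketch; the endpoint and type names are assumptions:

```ts
// Shared contract between the mock and API-backed data sources.
export interface AlgorithmSummary {
  name: string;
  category: string;
}

export interface AlgorithmDataSource {
  listAlgorithms(): Promise<AlgorithmSummary[]>;
}

// Current behaviour: bundled mock data.
export const mockDataSource: AlgorithmDataSource = {
  async listAlgorithms() {
    return [
      { name: "Binary Search", category: "searching" },
      { name: "Merge Sort", category: "sorting" },
    ];
  },
};

// Future behaviour: a backend API (hypothetical endpoint).
export const apiDataSource: AlgorithmDataSource = {
  async listAlgorithms() {
    const res = await fetch("/api/algorithms");
    if (!res.ok) throw new Error(`API error: ${res.status}`);
    return (await res.json()) as AlgorithmSummary[];
  },
};
```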
- Add authentication for user contributions
- Implement algorithm visualizations
- Add complexity comparisons between algorithms
- Create a community section for discussions
- Add a feature to save favorite algorithms
MIT License
Made with ❤️ by DSA Search Engine Team