I'm learning LLMs and Agentic AI by building something
NOTE: This project is currently in alpha. In fact, it's very alpha. This means it is still under active development and may undergo significant changes. Features may be incomplete or unstable. Got suggestions on what you would like to see or how to make it better? Add an issue and let us know!
- Overview
- Project Goals
- Features
- Why This Project Was Built
- Technologies Used
- Prerequisites
- First-Time Project Setup
- Running the Project Locally
- Development Tasks
- Install Nx Console
- Projects
- Contributing
- License
- Acknowledgments
- Useful links
- FAQs
Erisfy is an AI-powered stock screener that simplifies investing by transforming raw market data into clear, actionable insights, helping investors discover opportunities, analyze trends, and make confident decisions faster. The project explores integrating Large Language Models (LLMs) and Agentic AI into modern TypeScript-based web products. It is in its early alpha stage and very much a work in progress; the goal is to learn by doing, creating a functional application that showcases the potential of LLMs and Agentic AI in FinTech.
- Learning Through Building: Create a practical application that serves as a learning platform for integrating Large Language Models (LLMs) and Agentic AI into modern web applications.
- AI-Powered Stock Screening: Build an intelligent system that transforms complex market data into clear, actionable insights for investors at all levels.
- Modular AI Architecture: Implement a flexible, modular agent architecture with weighted filters and rule sets that can evolve through feedback loops.
- Developer Education: Document the journey of building AI-powered features, sharing insights and lessons learned with the developer community.
- Accessible Investment Intelligence: Democratize financial analysis by making AI-driven insights available and understandable to everyday investors.
- User-Centric Design: Create an interface that simplifies complex financial data without overwhelming users, focusing on clear explanations and guided experiences.
- Transparent AI Decision-Making: Implement explainable AI features that help users understand the reasoning behind investment recommendations.
- Adaptive Learning System: Develop AI filters that adjust based on user preferences and investment styles, creating a more personalized experience.
For a detailed breakdown of the project's features and roadmap, see the Implementation Plan and Modular Agent Architecture documentation.
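The Modular Agent Architecture document describes agents built around weighted filters and rule sets that evolve with user feedback. As a rough illustration of that idea, here is a minimal TypeScript sketch; the types, metric names, and update rule are hypothetical and not the project's actual API:

```typescript
// Hypothetical sketch of weighted filters feeding a ranking signal; not Erisfy's actual API.
interface StockSnapshot {
  ticker: string;
  metrics: Record<string, number>; // e.g. peRatio, sixMonthReturn (illustrative names)
}

interface WeightedFilter {
  name: string;
  weight: number; // relative importance, adjusted over time from user feedback
  score(stock: StockSnapshot): number; // normalized score in the 0..1 range
}

const filters: WeightedFilter[] = [
  { name: 'value', weight: 0.6, score: (s) => (s.metrics.peRatio < 15 ? 1 : 0) },
  { name: 'momentum', weight: 0.4, score: (s) => Math.min(Math.max(s.metrics.sixMonthReturn, 0), 1) },
];

// Combine the weighted filter scores into a single ranking signal for a stock.
function rankStock(stock: StockSnapshot): number {
  const totalWeight = filters.reduce((sum, f) => sum + f.weight, 0);
  return filters.reduce((sum, f) => sum + f.weight * f.score(stock), 0) / totalWeight;
}

// A feedback loop could nudge a filter's weight toward or away from user agreement.
function applyFeedback(filter: WeightedFilter, userAgreed: boolean, learningRate = 0.05): void {
  filter.weight = Math.max(0, Math.min(1, filter.weight + (userAgreed ? learningRate : -learningRate)));
}
```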
- 🤖 AI-Powered Analysis: Transforms complex market data into clear, actionable insights using LLMs and Agentic AI
- 🎯 Modular Agent Architecture: Flexible AI agents with weighted filters and rule sets that evolve through feedback
- 🔍 Smart Stock Screening: Intelligent system for discovering investment opportunities based on customizable criteria
- 📊 Natural Language Insights: Plain-language explanations of market trends and investment recommendations
- 🎓 Learning-Focused Design: Educational components that help users understand investment concepts and AI decisions
- ⚡ Adaptive Filtering: Dynamic filters that adjust based on user preferences and investment styles
- 🛠 Modern Tech Stack: Built with React, TypeScript, and Nx in a scalable monorepo architecture
- 🎨 User-Centric UI: Clean interface using Tailwind CSS and shadcn/ui for consistent design patterns
- 🧪 Comprehensive Testing: End-to-end and unit testing with Playwright and Vitest
- 📱 Responsive Design: Mobile-first approach ensuring accessibility across all devices
- 🔄 Continuous Integration: Automated builds and testing through GitHub Actions
- 📚 Extensive Documentation: Detailed documentation of architecture, features, and development practices
Large Language Models (LLMs) and Agentic AI represent powerful technologies that are transforming software development and user experiences. However, the best way to truly understand and harness these technologies is through hands-on application to real-world problems. This project was born from that philosophy of learning by doing.
Erisfy serves as a practical learning platform where we're actively exploring and implementing:
- Integration of LLMs for natural language processing and user interactions
- Development of modular AI agents with specific, focused capabilities
- Creation of weighted filtering systems that learn and adapt from user feedback
- Implementation of transparent, explainable AI decision-making processes
- Building AI-driven features that solve real user problems in the investment space
Rather than just experimenting with AI in isolation, we're building a functional, user-centric application that demonstrates how these technologies can make complex financial data more accessible and actionable for everyday investors. This approach allows us to:
- Learn through practical implementation challenges
- Document and share insights about integrating AI into modern web applications
- Create reusable patterns for AI-driven features
- Build something genuinely useful while advancing our understanding of AI technologies
By focusing on a real-world application in the FinTech space, we ensure our learning is grounded in practical problems and user needs, rather than theoretical exercises. The project aims to be both a learning journey and a useful tool, demonstrating how AI can democratize financial analysis and decision-making.
To use Erisfy, make sure you have the following installed and configured:
- Node.js v20.14.0 (or higher)
- pnpm v9.15.0 (or higher)
- Docker Desktop (required for certain services)
- API Keys (required for core functionality)
- API Key Setup Guide - Complete guide for obtaining necessary API keys and exploring AI model alternatives
- The News API key - Required for market news data
- OpenAI API key - Required for AI analysis (or see our guide for alternative models)
- Financial Datasets API key - Required for stock market data and financial metrics
- Tavily API key - Required for real-time, accurate search results tailored for LLMs and RAG
The server application requires environment variables to be configured. We use different environment files for development and production:
- `.env.development`: Local development environment (not committed to source control)
- `.env.production`: Production environment with placeholders (populated by CI pipeline)
- `.env.example`: Example configuration template
Follow these steps for local development:
1. Navigate to the server application directory: `cd apps/server`

2. Copy the example environment file to create your development environment: `cp .env.example .env.development`

3. Edit `.env.development` and set your environment variables:

   - `PORT`: The server port (defaults to 3001)
   - `NODE_ENV`: The environment (development/production/test)
   - `OPENAI_API_KEY`: Your OpenAI API key (required)

Note: The `.env.development` file contains sensitive information and is not committed to the repository. Each developer needs to maintain their own `.env.development` file locally.
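For reference, a minimal sketch of how server code might read these values at startup (illustrative only; the actual server may centralize configuration differently):

```typescript
// Illustrative: read the variables above from process.env and fail fast if a required one is missing.
const config = {
  port: Number(process.env.PORT ?? 3001),
  nodeEnv: process.env.NODE_ENV ?? 'development',
  openAiApiKey: process.env.OPENAI_API_KEY,
};

if (!config.openAiApiKey) {
  throw new Error('OPENAI_API_KEY is not set; see apps/server/.env.development');
}

export default config;
```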
When running the application locally with Docker, ensure your server's database configuration matches the Docker settings:
# Database connection settings in .env.development
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/erisfydb?schema=public"
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=erisfydb
POSTGRES_PORT=5432
POSTGRES_HOST=localhost
These environment variables should match the values in `docker-compose.yml`. If you modify the Docker configuration, update your environment variables accordingly.
When running the database for the first time:
1. Start the Docker container: `pnpm run serve:docker`
2. Run database migrations: `nx run server:prisma-migrate`

This will create the necessary database schema and apply any pending migrations.
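To double-check that the container is reachable and the migrations applied, a small sanity-check script along these lines can help (a sketch; it assumes the Prisma client has been generated):

```typescript
// Quick connectivity check against the database configured by DATABASE_URL.
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient();

async function main() {
  await prisma.$connect(); // throws if DATABASE_URL is wrong or the container is down
  const result = await prisma.$queryRaw`SELECT 1 AS ok`;
  console.log('Database reachable:', result);
}

main()
  .catch((error) => {
    console.error(error);
    process.exit(1);
  })
  .finally(() => prisma.$disconnect());
```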
If you modify the Prisma schema, create and apply a new migration with `pnpm run prisma:migrate`.

If you have changes pending in the database (e.g., from another branch), pull them into your local schema with `npx prisma db pull`, then update the generated Prisma client with `npx prisma generate`.
If you encounter errors:
- Verify Docker is running and ports are not in use
- Check logs: `docker-compose logs db`
- Confirm environment variables match Docker settings
Note: Nx runs Prisma commands in a non-interactive shell, so you may not be prompted for a migration name. You can work around this by:
- Running Prisma directly in an interactive shell: `npx prisma migrate dev --name your_migration`
- Passing the `--name` argument via Nx: `nx run server:prisma-migrate -- --name your_migration`
Follow these steps when setting up the project for the first time:
1. Ensure prerequisites are installed:
   - Node.js v20.14.0 (or higher)
   - pnpm v9.15.0 (or higher)
   - Docker Desktop
   - Required API keys (see API Key Setup Guide)

2. Clone the repository and move into it: `git clone https://github.com/CambridgeMonorail/erisfy.git`, then `cd erisfy`

3. Install dependencies: `pnpm install`

4. Start the Docker Desktop application

5. Run the automated setup script: `node scripts/dev-setup.js`
This script will:
- Prompt for required environment variables (OpenAI API key, etc.)
- Create necessary environment files
- Start Docker containers
- Run initial database migrations
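To give a sense of what the script automates, the flow is roughly the following (a simplified sketch, not the actual contents of scripts/dev-setup.js):

```typescript
// Simplified sketch of the setup flow; the real scripts/dev-setup.js differs in detail.
import { writeFileSync } from 'node:fs';
import { execSync } from 'node:child_process';
import { createInterface } from 'node:readline/promises';

async function main() {
  const rl = createInterface({ input: process.stdin, output: process.stdout });
  const openAiKey = await rl.question('OpenAI API key: ');
  rl.close();

  // Write a development environment file for the server.
  writeFileSync(
    'apps/server/.env.development',
    `PORT=3001\nNODE_ENV=development\nOPENAI_API_KEY=${openAiKey}\n`,
  );

  // Start the database container in the background and apply migrations.
  execSync('docker-compose up -d db', { stdio: 'inherit' });
  execSync('nx run server:prisma-migrate', { stdio: 'inherit' });
}

main();
```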
Follow these steps every time you want to run the project:
1. Start Docker Desktop (if not already running)

2. Start the Docker containers: `pnpm run serve:docker`

   Wait until you see the message that the database is ready.

3. In a new terminal, start the server: `pnpm run serve:server`

   Wait until you see the message that the server is running.

4. In another new terminal, start the client: `pnpm run serve:client`
Your application should now be running at:
- Client: http://localhost:4200
- Server: http://localhost:3001
- Database Admin (Adminer): http://localhost:8080
To stop the application:
- Press Ctrl+C in each terminal window (client and server)
- Stop the Docker containers with Ctrl+C in the Docker terminal
- (Optional) Close Docker Desktop if you're done developing
If you encounter issues while running the project:
1. Port Conflicts:
   - Check if ports 3001, 4200, 5432, or 8080 are already in use
   - Stop any conflicting services or change the ports in your environment files

2. Database Connection Issues:
   - Ensure Docker containers are running: `docker ps`
   - Check Docker logs: `docker-compose logs db`
   - Verify database connection settings in `.env.development`

3. Server Won't Start:
   - Verify Docker is running and the database container is healthy
   - Check server logs for specific error messages
   - Verify environment variables in `.env.development`

4. Client Won't Start:
   - Check if the server is running and accessible
   - Verify environment variables in the client's `.env`
   - Clear browser cache and reload

5. "Database Not Ready" Errors:
   - Wait a few more seconds for the database to initialize
   - If the error persists, restart the Docker containers: `docker-compose down`, then `pnpm run serve:docker`
To install and set up the project, follow these steps:
1. Clone the repository: `git clone https://github.com/CambridgeMonorail/erisfy.git`

2. Navigate to the project directory: `cd erisfy`

3. Install dependencies: `pnpm install`

4. Run the development setup script: `node scripts/dev-setup.js`
This script will:
- Prompt for required environment variables (OpenAI API key, etc.)
- Create necessary environment files
- Start Docker containers
- Run initial database migrations
Note: Make sure Docker Desktop is running before executing the setup script.
Once the setup is complete, you can start the application:
1. Start the server: `pnpm run serve:server`

2. In a new terminal, start the client: `pnpm run serve:client`
The client will run on port 4200 and should automatically open in your default browser.
The project uses Docker Compose to manage the PostgreSQL database service and Adminer for database management. The configuration is defined in `docker-compose.yml`:
version: '3'
services:
db:
image: postgres:15
environment:
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
- POSTGRES_DB=erisfydb
ports:
- "5432:5432"
adminer:
image: adminer
ports:
- "8080:8080"
depends_on:
- db
This configuration:
- Uses PostgreSQL version 15
- Creates a database named 'erisfydb'
- Sets up default credentials (username: postgres, password: postgres)
- Maps port 5432 on your host machine to port 5432 in the container
- Includes Adminer for database management
Adminer:
- Runs at http://localhost:8080
- System: PostgreSQL
- Server: db
- Username: postgres
- Password: postgres
- Database: erisfydb
- Start the container: `pnpm run serve:docker`
- Stop the container: Press Ctrl+C in the terminal running Docker
- Remove the container: `docker-compose down`
- Remove the container and data volume: `docker-compose down -v`
1. Port Conflicts: If port 5432 is already in use:
   - Check for running PostgreSQL instances: `docker ps`
   - Stop any conflicting services
   - Or modify the port mapping in `docker-compose.yml`

2. Container Won't Start:
   - Check that Docker Desktop is running
   - Try removing the container: `docker-compose down`
   - Check Docker logs: `docker-compose logs db`

3. Data Persistence:
   - Data is stored in a Docker volume by default
   - To start fresh, remove the volume: `docker-compose down -v`
- `pnpm run serve:docker`: Start the PostgreSQL database container
- `pnpm run serve:server`: Start the backend server in development mode
- `pnpm run serve:client`: Start the frontend client in development mode
- `pnpm run build:client`: Create a production bundle of the client
- `pnpm run build:server`: Create a production bundle of the server
To see all available targets to run for a project:
npx nx show project client
These targets are either inferred automatically or defined in the `project.json` or `package.json` files.
More about running tasks in the docs »
The following scripts are available to manage and build the project:
- Build
  - `pnpm run build:affected`: Build only the affected projects.
  - `pnpm run build:all`: Build all projects.
  - `pnpm run build:client`: Build the client application.
  - `pnpm run build:shadcnui`: Build the shadcnui library.
- Clean
  - `pnpm run clean`: Clean all projects.
- Format
  - `pnpm run format:check`: Check the formatting of the code.
  - `pnpm run format`: Format the code.
- Lint
  - `pnpm run lint:affected`: Lint only the affected projects.
  - `pnpm run lint:all`: Lint all projects.
  - `pnpm run lint:client`: Lint the client application.
  - `pnpm run lint:shadcnui`: Lint the shadcnui library.
- Precommit
  - `pnpm run precommit`: Run lint, type-check, build, and test for all projects before committing.
- Prepare
  - `pnpm run prepare`: Prepare Husky for Git hooks.
- Serve
  - `pnpm run serve:client`: Serve the client application.
  - `pnpm run serve:storybook`: Serve the Storybook instance.
- Test
  - `pnpm run test:affected`: Test only the affected projects.
  - `pnpm run test:all`: Test all projects.
  - `pnpm run test:client`: Test the client application.
  - `pnpm run test:shadcnui`: Test the shadcnui library.
- Type-check
  - `pnpm run type-check:affected`: Type-check only the affected projects.
  - `pnpm run type-check:all`: Type-check all projects.
  - `pnpm run type-check:client`: Type-check the client application.
  - `pnpm run type-check:shadcnui`: Type-check the shadcnui library.
Nx Console is an editor extension that enriches your developer experience. It lets you run tasks, generate code, and improves code autocompletion in your IDE. It is available for VSCode and IntelliJ.
- Erisfy: The main project integrating LLMs and Agentic AI into a modern TypeScript-based web product.
For detailed information on the projects, refer to the supporting documentation in the `docs/specs` directory:
- 01 - Elevator Pitch
- 02 - Pitch
- 03 - Competitor Analysis
- 04 - Product Requirements Document (PRD)
- 05 - Technical Requirements Document (TRD)
- 06 - Implementation Plan
- 07 - Modular Agent Architecture
To learn how to theme your app using Shadcn UI and Tailwind CSS, please refer to the detailed guide in docs/theming-a-new-app.md.
Note: The current theme was generated using the Ready.js Shadcn UI Theme Generator.
To add a new component page to the routing in your React SPA, please refer to the detailed guide in docs/adding-new-component-page.md.
The client application (`apps/client`) includes API mocking capabilities using Mock Service Worker (MSW). This feature is particularly important as it allows us to:
- Deploy and showcase the frontend on GitHub Pages without requiring a backend
- Develop and test frontend features independently of the backend
- Present working demos of the application for review before backend deployment
- Enable frontend developers to work without a local backend setup
The API mocking configuration is specific to the client application located in `apps/client`. To enable or disable the mocks:
1. Open the `.env` file in the `apps/client` directory
2. Set the `VITE_REACT_APP_USE_MOCKS` environment variable:
   - `VITE_REACT_APP_USE_MOCKS=true` enables mocks (for GitHub Pages deployment or local frontend-only development)
   - `VITE_REACT_APP_USE_MOCKS=false` disables mocks when working with the actual backend
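Under the hood, the client can check this flag before rendering and only then start the MSW worker; a minimal sketch follows (the actual bootstrap code in `apps/client` may differ):

```typescript
// Sketch: start the MSW worker only when mocks are enabled via the Vite env flag.
async function enableMocking(): Promise<void> {
  if (import.meta.env.VITE_REACT_APP_USE_MOCKS !== 'true') {
    return; // talk to the real backend
  }
  const { worker } = await import('./mocks/browser');
  await worker.start({ onUnhandledRequest: 'bypass' }); // let unmocked requests pass through
}

enableMocking().then(() => {
  // mount the React application here
});
```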
When mocks are enabled, the client application will intercept API requests and return mock data instead of attempting to communicate with the backend server. This is particularly useful for:
- GitHub Pages deployments where no backend is available
- Frontend development and testing
- Creating demonstrations and previews
- UI/UX reviews and demonstrations
The mock service implementation for the client is located in:
- `apps/client/src/mocks/` - Main mocking setup and handlers
- `apps/client/src/mocks/handlers/` - API endpoint mock implementations
- `apps/client/src/mocks/data/` - Mock data responses
To add or modify mock endpoints:
- Create or update handler files in `apps/client/src/mocks/handlers/`
- Define mock data in `apps/client/src/mocks/data/`
- Register handlers in `apps/client/src/mocks/browser.ts`
Example mock handler (the data module import below is illustrative; adjust paths and names to your setup):

// apps/client/src/mocks/handlers/stockHandler.ts
import { rest } from 'msw';
import { mockStocksData } from '../data/stocks'; // illustrative path to a mock data module

// Respond to GET /api/stocks with the mock stock list
export const stockHandler = rest.get('/api/stocks', (req, res, ctx) => {
  return res(ctx.status(200), ctx.json(mockStocksData));
});
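Handlers are then registered with the worker used in the browser; a sketch of what `apps/client/src/mocks/browser.ts` might look like (handler names are illustrative):

```typescript
// apps/client/src/mocks/browser.ts (sketch)
import { setupWorker } from 'msw';
import { stockHandler } from './handlers/stockHandler';

// Register all request handlers with the browser service worker.
export const worker = setupWorker(stockHandler);
```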
When deploying to GitHub Pages, these mocks ensure that:
- The application remains functional without a backend
- Features can be demonstrated to stakeholders
- UI/UX can be reviewed in a production-like environment
- Frontend iterations can be quickly deployed and tested
First, ensure that the mock server is set up in the `apps/client/src/mocks/server.ts` file.

Next, import and start the mock server in your test setup file (`apps/client/src/test/setup.ts`):

import { beforeAll, afterEach, afterAll } from 'vitest';
import { server } from '../mocks/server';

beforeAll(() => server.listen());        // start request interception before the test run
afterEach(() => server.resetHandlers()); // reset per-test handler overrides
afterAll(() => server.close());          // clean up when the test run finishes
Finally, write your tests as usual, and MSW will intercept the API requests and return the mock responses.
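Individual tests can also swap in a different response temporarily with `server.use`; for example (a sketch, with rendering and assertions omitted):

```typescript
import { describe, it } from 'vitest';
import { rest } from 'msw';
import { server } from '../mocks/server';

describe('stocks screen (mocked API)', () => {
  it('shows an error state when the stocks endpoint fails', () => {
    // Override the default handler for this test only;
    // afterEach(() => server.resetHandlers()) in the setup file restores it.
    server.use(rest.get('/api/stocks', (_req, res, ctx) => res(ctx.status(500))));

    // ...render the component, trigger the request, and assert on the error UI
  });
});
```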
When working on frontend features, you can choose between using mocked or real API endpoints:
1. Local Development with Mocks (No Backend Required):

   Set `VITE_REACT_APP_USE_MOCKS=true` in `apps/client/.env.development`.

   Perfect for:
   - Initial UI development
   - Prototyping new features
   - Working offline
   - Frontend-only changes

2. Full Stack Development (Backend Required):

   Set `VITE_REACT_APP_USE_MOCKS=false` in `apps/client/.env.development`.

   Used when:
   - Testing real API integration
   - Developing backend features
   - Validating end-to-end functionality

3. Production Deployment:

   Set `VITE_REACT_APP_USE_MOCKS=false` in `apps/client/.env.production`.

   Always use real API endpoints in production.

4. GitHub Pages Deployment:

   Set `VITE_REACT_APP_USE_MOCKS=true` in `apps/client/.env.production`.

   Enables demonstration of frontend features without backend deployment.
For detailed steps on setting up MSW and integrating API mocking in Erisfy, refer to the Integrating API Mocking with MSW in Erisfy documentation.
Contributions are welcome! Please open an issue or submit a pull request for any changes. For detailed guidelines on how to contribute, see Contributing.
This project is licensed under the MIT License.
- joshuarobs/nx-shadcn-ui-monorepo
- Shadcn UI
- Nx
- Shadcn UI Theme Generator: A tool for generating themes for Shadcn UI.
Learn more:
- Learn more about this workspace setup
- Learn about Nx on CI
- Releasing Packages with Nx release
- What are Nx plugins?
It's not finished yet.
It's not finished yet.
It's not finished yet.
It's not finished yet.
It's not finished yet.
It's not finished yet.
It's not finished yet. (Just kidding, that's actually due to Rayleigh scattering.)
If you encounter issues during setup:
1. Setup Script Fails:
   - Ensure Docker Desktop is running
   - Check that ports 3001 (server) and 5432 (database) are available
   - Verify your API keys are valid (see our API Key Setup Guide)
   - For API alternatives and self-hosted options, refer to our setup guide

2. Manual Setup: If you prefer to set up manually or the script fails, follow these steps:

   a. Create `apps/server/.env.development` with:

      PORT=3001
      NODE_ENV=development
      OPENAI_API_KEY=your_openai_api_key
      DATABASE_URL=postgresql://postgres:postgres@localhost:5432/erisfydb?schema=public
      POSTGRES_USER=postgres
      POSTGRES_PASSWORD=postgres
      POSTGRES_DB=erisfydb
      POSTGRES_PORT=5432
      POSTGRES_HOST=localhost

   b. Create `apps/client/.env` with:

      VITE_REACT_APP_USE_MOCKS=false

   c. Start Docker and run migrations: `pnpm run serve:docker`, then `nx run server:prisma-migrate`

3. Database Issues:
   - If migrations fail, try resetting the database: `docker-compose down -v`, then `pnpm run serve:docker` and `nx run server:prisma-migrate`