Erisfy

Erisfy Logo

I'm learning LLMs and Agentic AI by building something

Project Status Version Build Status License Last Commit

NOTE: This project is currently in alpha. In fact, it's very alpha. This means it is still under active development and may undergo significant changes. Features may be incomplete or unstable. Got suggestions on what you would like to see or how to make it better? Add an issue and let us know!

Table of Contents

  • Overview
  • Project Goals
  • Features
  • Why This Project Was Built
  • Technologies Used
  • Prerequisites
  • Environment Configuration
  • First-Time Project Setup
  • Running the Project Locally
  • Installation
  • Docker Configuration
  • Development Commands Quick Reference
  • Development Tasks
  • Projects
  • Theming Your App
  • API Mocking for Frontend Development and GitHub Pages
  • Contributing
  • License
  • FAQs
  • Troubleshooting Setup

Overview

Erisfy is an AI-powered stock screener that simplifies investing by transforming raw market data into clear, actionable insights, helping investors discover opportunities, analyze trends, and make confident decisions faster. The project explores how to integrate Large Language Models (LLMs) and Agentic AI into a modern TypeScript-based web product. It is in its early alpha stage and very much a work in progress: the goal is to learn by doing, building a functional application that showcases the potential of LLMs and Agentic AI in FinTech.

Project Goals

  • Learning Through Building: Create a practical application that serves as a learning platform for integrating Large Language Models (LLMs) and Agentic AI into modern web applications.
  • AI-Powered Stock Screening: Build an intelligent system that transforms complex market data into clear, actionable insights for investors at all levels.
  • Modular AI Architecture: Implement a flexible, modular agent architecture with weighted filters and rule sets that can evolve through feedback loops (a simple illustrative sketch follows this list).
  • Developer Education: Document the journey of building AI-powered features, sharing insights and lessons learned with the developer community.
  • Accessible Investment Intelligence: Democratize financial analysis by making AI-driven insights available and understandable to everyday investors.
  • User-Centric Design: Create an interface that simplifies complex financial data without overwhelming users, focusing on clear explanations and guided experiences.
  • Transparent AI Decision-Making: Implement explainable AI features that help users understand the reasoning behind investment recommendations.
  • Adaptive Learning System: Develop AI filters that adjust based on user preferences and investment styles, creating a more personalized experience.
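
To make the weighted-filter idea concrete, here is a purely illustrative TypeScript sketch. The names and types are invented for explanation only and are not the project's actual implementation; see the Modular Agent Architecture documentation for the real design:

// Purely illustrative: a weighted filter contributes a 0..1 signal per stock,
// and its weight is adjusted over time from user feedback.
interface WeightedFilter {
  name: string;
  weight: number;
  score(metrics: Record<string, number>): number;
}

function rankCandidates(
  candidates: { ticker: string; metrics: Record<string, number> }[],
  filters: WeightedFilter[],
): { ticker: string; total: number }[] {
  return candidates
    .map((candidate) => ({
      ticker: candidate.ticker,
      // Weighted sum of every filter's signal for this stock.
      total: filters.reduce(
        (sum, filter) => sum + filter.weight * filter.score(candidate.metrics),
        0,
      ),
    }))
    .sort((a, b) => b.total - a.total);
}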

For a detailed breakdown of the project's features and roadmap, see the Implementation Plan and Modular Agent Architecture documentation.

Features

  • 🤖 AI-Powered Analysis: Transforms complex market data into clear, actionable insights using LLMs and Agentic AI
  • 🎯 Modular Agent Architecture: Flexible AI agents with weighted filters and rule sets that evolve through feedback
  • 🔍 Smart Stock Screening: Intelligent system for discovering investment opportunities based on customizable criteria
  • 📊 Natural Language Insights: Plain-language explanations of market trends and investment recommendations
  • 🎓 Learning-Focused Design: Educational components that help users understand investment concepts and AI decisions
  • Adaptive Filtering: Dynamic filters that adjust based on user preferences and investment styles
  • 🛠 Modern Tech Stack: Built with React, TypeScript, and Nx in a scalable monorepo architecture
  • 🎨 User-Centric UI: Clean interface using Tailwind CSS and shadcn/ui for consistent design patterns
  • 🧪 Comprehensive Testing: End-to-end and unit testing with Playwright and Vitest
  • 📱 Responsive Design: Mobile-first approach ensuring accessibility across all devices
  • 🔄 Continuous Integration: Automated builds and testing through GitHub Actions
  • 📚 Extensive Documentation: Detailed documentation of architecture, features, and development practices

Why This Project Was Built

Large Language Models (LLMs) and Agentic AI represent powerful technologies that are transforming software development and user experiences. However, the best way to truly understand and harness these technologies is through hands-on application to real-world problems. This project was born from that philosophy of learning by doing.

Erisfy serves as a practical learning platform where we're actively exploring and implementing:

  • Integration of LLMs for natural language processing and user interactions
  • Development of modular AI agents with specific, focused capabilities
  • Creation of weighted filtering systems that learn and adapt from user feedback
  • Implementation of transparent, explainable AI decision-making processes
  • Building AI-driven features that solve real user problems in the investment space

Rather than just experimenting with AI in isolation, we're building a functional, user-centric application that demonstrates how these technologies can make complex financial data more accessible and actionable for everyday investors. This approach allows us to:

  • Learn through practical implementation challenges
  • Document and share insights about integrating AI into modern web applications
  • Create reusable patterns for AI-driven features
  • Build something genuinely useful while advancing our understanding of AI technologies

By focusing on a real-world application in the FinTech space, we ensure our learning is grounded in practical problems and user needs, rather than theoretical exercises. The project aims to be both a learning journey and a useful tool, demonstrating how AI can democratize financial analysis and decision-making.

Technologies Used

React TypeScript Node.js NestJS PostgreSQL Prisma Docker Nx Markdown pnpm Vite GitHub GitHub Actions Tailwind CSS shadcn/ui React Router Vitest Playwright Storybook Jest ESLint Visual Studio Code GitHub Copilot

Prerequisites

To use Erisfy, make sure you have the following installed and configured:

  • Node.js v20.14.0 (or higher)
  • pnpm v9.15.0 (or higher)
  • Docker Desktop (required for certain services)
  • API Keys (required for core functionality)
    • API Key Setup Guide - Complete guide for obtaining necessary API keys and exploring AI model alternatives
    • The News API key - Required for market news data
    • OpenAI API key - Required for AI analysis (or see our guide for alternative models)
    • Financial Datasets API key - Required for stock market data and financial metrics
    • Tavily API key - Required for real-time, accurate search results tailored for LLMs and RAG

Environment Configuration

The server application requires environment variables to be configured. We use different environment files for development and production:

  • .env.development: Local development environment (not committed to source control)
  • .env.production: Production environment with placeholders (populated by CI pipeline)
  • .env.example: Example configuration template

Follow these steps for local development:

  1. Navigate to the server application directory:

    cd apps/server
  2. Copy the example environment file to create your development environment:

    cp .env.example .env.development
  3. Edit .env.development and set your environment variables:

    • PORT: The server port (defaults to 3001)
    • NODE_ENV: The environment (development/production/test)
    • OPENAI_API_KEY: Your OpenAI API key (required)

Note: The .env.development file contains sensitive information and is not committed to the repository. Each developer needs to maintain their own .env.development file locally.
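
For reference, this is a minimal sketch of how the server could load these files with @nestjs/config (an assumption for illustration; Erisfy's actual bootstrap code may differ):

// Sketch: loading environment files in a NestJS root module (assumes @nestjs/config).
import { Module } from '@nestjs/common';
import { ConfigModule } from '@nestjs/config';

@Module({
  imports: [
    ConfigModule.forRoot({
      // Pick the file that matches the current NODE_ENV, e.g. .env.development.
      envFilePath: `.env.${process.env.NODE_ENV ?? 'development'}`,
      isGlobal: true,
    }),
  ],
})
export class AppModule {}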

Database Connection Settings

When running the application locally with Docker, ensure your server's database configuration matches the Docker settings:

# Database connection settings in .env.development
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/erisfydb?schema=public"
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=erisfydb
POSTGRES_PORT=5432
POSTGRES_HOST=localhost

These environment variables should match the values in docker-compose.yml. If you modify the Docker configuration, update your environment variables accordingly.
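
As a quick sanity check that DATABASE_URL is being picked up, you can run a small script like the following (a sketch; it assumes the Prisma client has already been generated):

// Sketch: verify the Dockerised Postgres instance is reachable via Prisma.
import { PrismaClient } from '@prisma/client';

const prisma = new PrismaClient(); // reads DATABASE_URL from the environment

async function checkConnection(): Promise<void> {
  await prisma.$queryRaw`SELECT 1`; // simple connectivity probe
  console.log('Database connection OK');
  await prisma.$disconnect();
}

checkConnection().catch((error) => {
  console.error('Database connection failed:', error);
  process.exit(1);
});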

First-Time Database Setup

When running the database for the first time:

  1. Start the Docker container:

    pnpm run serve:docker
  2. Run database migrations:

    nx run server:prisma-migrate

This will create the necessary database schema and apply any pending migrations.

Updating the Database & Troubleshooting

If you modify the Prisma schema, create and apply a new migration:

pnpm run prisma:migrate

If you have changes pending in the DB (e.g., from another branch), pull them into your local schema:

npx prisma db pull

Then update the generated Prisma client:

npx prisma generate

If you encounter errors:

  • Verify Docker is running and ports are not in use
  • Check logs: docker-compose logs db
  • Confirm environment variables match Docker settings

Note: Nx runs Prisma commands in a non-interactive shell, so you may not be prompted for a migration name. You can work around this by:

  • Running Prisma directly in an interactive shell:

    npx prisma migrate dev --name your_migration
  • Passing the --name argument via Nx:

    nx run server:prisma-migrate -- --name your_migration

First-Time Project Setup

Follow these steps when setting up the project for the first time:

  1. Ensure prerequisites are installed:

    • Node.js v20.14.0 (or higher)
    • pnpm v9.15.0 (or higher)
    • Docker Desktop
    • Required API keys (see API Key Setup Guide)
  2. Clone the repository:

    git clone https://github.com/CambridgeMonorail/erisfy.git
    cd erisfy
  3. Install dependencies:

    pnpm install
  4. Start Docker Desktop application

  5. Run the automated setup script:

    node scripts/dev-setup.js

    This script will:

    • Prompt for required environment variables (OpenAI API key, etc.)
    • Create necessary environment files
    • Start Docker containers
    • Run initial database migrations

Running the Project Locally

Follow these steps every time you want to run the project:

  1. Start Docker Desktop (if not already running)

  2. Start the Docker containers:

    pnpm run serve:docker

    Wait until you see the message that the database is ready.

  3. In a new terminal, start the server:

    pnpm run serve:server

    Wait until you see the message that the server is running.

  4. In another new terminal, start the client:

    pnpm run serve:client

Your application should now be running at:

  • Client: http://localhost:4200
  • Server API: http://localhost:3001
  • Adminer (database management): http://localhost:8080

To stop the application:

  1. Press Ctrl+C in each terminal window (client and server)
  2. Stop the Docker containers with Ctrl+C in the Docker terminal
  3. (Optional) Close Docker Desktop if you're done developing

Troubleshooting Local Development

If you encounter issues while running the project:

  1. Port Conflicts:

    • Check if ports 3001, 4200, 5432, or 8080 are already in use
    • Stop any conflicting services or change the ports in your environment files
  2. Database Connection Issues:

    • Ensure Docker containers are running: docker ps
    • Check Docker logs: docker-compose logs db
    • Verify database connection settings in .env.development
  3. Server Won't Start:

    • Verify Docker is running and database container is healthy
    • Check server logs for specific error messages
    • Verify environment variables in .env.development
  4. Client Won't Start:

    • Check if the server is running and accessible
    • Verify environment variables in client's .env
    • Clear browser cache and reload
  5. "Database Not Ready" Errors:

    • Wait a few more seconds for the database to initialize

    • If persistent, try restarting the Docker containers:

      docker-compose down
      pnpm run serve:docker

Installation

To install and set up the project, follow these steps:

  1. Clone the repository:

    git clone https://github.com/CambridgeMonorail/erisfy.git
  2. Navigate to the project directory:

    cd erisfy
  3. Install dependencies:

    pnpm install
  4. Run the development setup script:

    node scripts/dev-setup.js

    This script will:

    • Prompt for required environment variables (OpenAI API key, etc.)
    • Create necessary environment files
    • Start Docker containers
    • Run initial database migrations

    Note: Make sure Docker Desktop is running before executing the setup script.

Running Locally

Once the setup is complete, you can start the application:

  1. Start the server:

    pnpm run serve:server
  2. In a new terminal, start the client:

    pnpm run serve:client

The client will run on port 4200 and should automatically open in your default browser.

Docker Configuration

The project uses Docker Compose to manage the PostgreSQL database service and Adminer for database management. The configuration is defined in docker-compose.yml:

version: '3'
services:
  db:
    image: postgres:15
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=erisfydb
    ports:
      - "5432:5432"
  adminer:
    image: adminer
    ports:
      - "8080:8080"
    depends_on:
      - db

This configuration:

  • Uses PostgreSQL version 15
  • Creates a database named 'erisfydb'
  • Sets up default credentials (username: postgres, password: postgres)
  • Maps port 5432 on your host machine to port 5432 in the container
  • Includes Adminer for database management

Adminer:

  • Runs at http://localhost:8080
  • System: PostgreSQL
  • Server: db
  • Username: postgres
  • Password: postgres
  • Database: erisfydb

Managing the Docker Container

  • Start the container: pnpm run serve:docker
  • Stop the container: Press Ctrl+C in the terminal running Docker
  • Remove the container: docker-compose down
  • Remove container and data volume: docker-compose down -v

Troubleshooting Docker Issues

  1. Port Conflicts: If port 5432 is already in use:

    • Check for running PostgreSQL instances: docker ps
    • Stop any conflicting services
    • Or modify the port mapping in docker-compose.yml
  2. Container Won't Start:

    • Check Docker Desktop is running
    • Try removing the container: docker-compose down
    • Check Docker logs: docker-compose logs db
  3. Data Persistence:

    • Data is stored in a Docker volume by default
    • To start fresh, remove the volume: docker-compose down -v

Development Commands Quick Reference

  • pnpm run serve:docker: Start the PostgreSQL database container
  • pnpm run serve:server: Start the backend server in development mode
  • pnpm run serve:client: Start the frontend client in development mode
  • pnpm run build:client: Create a production bundle of the client
  • pnpm run build:server: Create a production bundle of the server

To see all available targets to run for a project:

npx nx show project client

These targets are either inferred automatically or defined in the project.json or package.json files.

More about running tasks in the docs »

Development Tasks

The following scripts are available to manage and build the project:

  • Build

    • pnpm run build:affected: Build only the affected projects.
    • pnpm run build:all: Build all projects.
    • pnpm run build:client: Build the client application.
    • pnpm run build:shadcnui: Build the shadcnui library.
  • Clean

    • pnpm run clean: Clean all projects.
  • Format

    • pnpm run format:check: Check the formatting of the code.
    • pnpm run format: Format the code.
  • Lint

    • pnpm run lint:affected: Lint only the affected projects.
    • pnpm run lint:all: Lint all projects.
    • pnpm run lint:client: Lint the client application.
    • pnpm run lint:shadcnui: Lint the shadcnui library.
  • Precommit

    • pnpm run precommit: Run lint, type-check, build, and test for all projects before committing.
  • Prepare

    • pnpm run prepare: Prepare Husky for Git hooks.
  • Serve

    • pnpm run serve:client: Serve the client application.
    • pnpm run serve:storybook: Serve the Storybook instance.
  • Test

    • pnpm run test:affected: Test only the affected projects.
    • pnpm run test:all: Test all projects.
    • pnpm run test:client: Test the client application.
    • pnpm run test:shadcnui: Test the shadcnui library.
  • Type-check

    • pnpm run type-check:affected: Type-check only the affected projects.
    • pnpm run type-check:all: Type-check all projects.
    • pnpm run type-check:client: Type-check the client application.
    • pnpm run type-check:shadcnui: Type-check the shadcnui library.

Install Nx Console

Nx Console is an editor extension that enriches your developer experience. It lets you run tasks, generate code, and get improved code autocompletion in your IDE. It is available for VSCode and IntelliJ.

Install Nx Console »

Projects

Current Projects

  • Erisfy: The main project integrating LLM and Agentic AI into a modern TypeScript-based web product.

Supporting Documentation

For detailed information on the projects, refer to the supporting documentation in the docs/specs directory.

Theming Your App

To learn how to theme your app using Shadcn UI and Tailwind CSS, please refer to the detailed guide in docs/theming-a-new-app.md.

Note: The current theme was generated using the Ready.js Shadcn UI Theme Generator.

Adding a New Component Page to the Routing in Your React SPA

To add a new component page to the routing in your React SPA, please refer to the detailed guide in docs/adding-new-component-page.md.
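
At a high level, adding a page means creating the component and registering a route for it. The following is only a rough illustration (the file path, route path, and component name are hypothetical; follow the guide above for the project's actual conventions):

// apps/client/src/app/routes/insights.tsx (illustrative path)
import { RouteObject } from 'react-router-dom';
import { InsightsPage } from '../pages/InsightsPage'; // hypothetical page component

export const insightsRoute: RouteObject = {
  path: '/insights',
  element: <InsightsPage />,
};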

API Mocking for Frontend Development and GitHub Pages

The client application (apps/client) includes API mocking capabilities using Mock Service Worker (MSW). This feature is particularly important as it allows us to:

  1. Deploy and showcase the frontend on GitHub Pages without requiring a backend
  2. Develop and test frontend features independently of the backend
  3. Present working demos of the application for review before backend deployment
  4. Enable frontend developers to work without a local backend setup

Enabling/Disabling API Mocks in the Client

The API mocking configuration is specific to the client application located in apps/client. To enable or disable the mocks:

  1. Open the .env file in the apps/client directory

  2. Set the VITE_REACT_APP_USE_MOCKS environment variable:

    # Enable mocks for GitHub Pages deployment or local frontend-only development
    VITE_REACT_APP_USE_MOCKS=true
    
    # Disable mocks when working with the actual backend
    VITE_REACT_APP_USE_MOCKS=false

When mocks are enabled, the client application will intercept API requests and return mock data instead of attempting to communicate with the backend server. This is particularly useful for:

  • GitHub Pages deployments where no backend is available
  • Frontend development and testing
  • Creating demonstrations and previews for stakeholders
  • UI/UX reviews
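
As an illustration of how this flag is consumed, the client can gate the MSW worker on the Vite environment variable before rendering the app (a sketch only; the exact entry-point wiring in apps/client may differ):

// Sketch: conditionally start MSW (e.g. from main.tsx) before rendering the app.
async function enableMocking(): Promise<void> {
  // Vite exposes VITE_-prefixed variables on import.meta.env as strings.
  if (import.meta.env.VITE_REACT_APP_USE_MOCKS !== 'true') {
    return;
  }
  const { worker } = await import('./mocks/browser');
  await worker.start({ onUnhandledRequest: 'bypass' });
}

enableMocking().then(() => {
  // Render the React application here once mocking (if enabled) is ready.
});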

Mock Implementation Details

The mock service implementation for the client is located in:

  • apps/client/src/mocks/ - Main mocking setup and handlers
  • apps/client/src/mocks/handlers/ - API endpoint mock implementations
  • apps/client/src/mocks/data/ - Mock data responses

To add or modify mock endpoints:

  1. Create or update handler files in apps/client/src/mocks/handlers/
  2. Define mock data in apps/client/src/mocks/data/
  3. Register handlers in apps/client/src/mocks/browser.ts

Example mock handler (imports added for completeness; the data import path is illustrative):

// apps/client/src/mocks/handlers/stockHandler.ts
import { rest } from 'msw';
import { mockStocksData } from '../data/stocks'; // illustrative path

export const stockHandlers = [
  rest.get('/api/stocks', (req, res, ctx) =>
    res(ctx.status(200), ctx.json(mockStocksData))
  ),
];
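
A matching registration in browser.ts might look like this (a sketch; the real file may compose several handler modules):

// apps/client/src/mocks/browser.ts
import { setupWorker } from 'msw';
import { stockHandlers } from './handlers/stockHandler';

// Compose all handler arrays into a single service worker instance.
export const worker = setupWorker(...stockHandlers);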

When deploying to GitHub Pages, these mocks ensure that:

  • The application remains functional without a backend
  • Features can be demonstrated to stakeholders
  • UI/UX can be reviewed in a production-like environment
  • Frontend iterations can be quickly deployed and tested

Running Tests with MSW

First, ensure that the mock server is set up in the apps/client/src/mocks/server.ts file.
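
A minimal server.ts mirrors the browser setup (a sketch, assuming the handler module shown above):

// apps/client/src/mocks/server.ts
import { setupServer } from 'msw/node';
import { stockHandlers } from './handlers/stockHandler';

// Node-side counterpart of the browser worker, used in Vitest runs.
export const server = setupServer(...stockHandlers);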

Next, import and start the mock server in your test setup file (apps/client/src/test/setup.ts):

// apps/client/src/test/setup.ts
import { afterAll, afterEach, beforeAll } from 'vitest'; // optional when Vitest globals are enabled
import { server } from '../mocks/server';

beforeAll(() => server.listen());
afterEach(() => server.resetHandlers());
afterAll(() => server.close());

Finally, write your tests as usual, and MSW will intercept the API requests and return the mock responses.

Development Workflow with API Mocks

When working on frontend features, you can choose between using mocked or real API endpoints:

  1. Local Development with Mocks (No Backend Required):

    # apps/client/.env.development
    VITE_REACT_APP_USE_MOCKS=true

    Perfect for:

    • Initial UI development
    • Prototyping new features
    • Working offline
    • Frontend-only changes
  2. Full Stack Development (Backend Required):

    # apps/client/.env.development
    VITE_REACT_APP_USE_MOCKS=false

    Used when:

    • Testing real API integration
    • Developing backend features
    • Validating end-to-end functionality
  3. Production Deployment:

    # apps/client/.env.production
    VITE_REACT_APP_USE_MOCKS=false

    Always use real API endpoints in production.

  4. GitHub Pages Deployment:

    # apps/client/.env.production
    VITE_REACT_APP_USE_MOCKS=true

    Enables demonstration of frontend features without backend deployment.

Reference Documentation

For detailed steps on setting up MSW and integrating API mocking in Erisfy, refer to the Integrating API Mocking with MSW in Erisfy documentation.

Contributing

Contributions are welcome! Please open an issue or submit a pull request for any changes. For detailed guidelines on how to contribute, see Contributing.

License

This project is licensed under the MIT License.

Acknowledgments

Useful links

Learn more:

FAQs

Why doesn't the app load?

It's not finished yet.

Why is the button not working?

It's not finished yet.

Why is there no dark mode?

It's not finished yet.

Why does the page look weird on mobile?

It's not finished yet.

Why is the documentation incomplete?

It's not finished yet.

Why can't I find the feature I need?

It's not finished yet.

Why is the sky blue?

It's not finished yet. (Just kidding, that's actually due to Rayleigh scattering.)

Troubleshooting Setup

If you encounter issues during setup:

  1. Setup Script Fails:

    • Ensure Docker Desktop is running
    • Check that ports 3001 (server) and 5432 (database) are available
    • Verify your API keys are valid (see our API Key Setup Guide)
    • For API alternatives and self-hosted options, refer to our setup guide
  2. Manual Setup: If you prefer to set up manually or the script fails, follow these steps:

    a. Create apps/server/.env.development:

    PORT=3001
    NODE_ENV=development
    OPENAI_API_KEY=your_openai_api_key
    DATABASE_URL=postgresql://postgres:postgres@localhost:5432/erisfydb?schema=public
    POSTGRES_USER=postgres
    POSTGRES_PASSWORD=postgres
    POSTGRES_DB=erisfydb
    POSTGRES_PORT=5432
    POSTGRES_HOST=localhost

    b. Create apps/client/.env:

    VITE_REACT_APP_USE_MOCKS=false

    c. Start Docker and run migrations:

    pnpm run serve:docker
    nx run server:prisma-migrate
  3. Database Issues:

    • If migrations fail, try resetting the database:

      docker-compose down -v
      pnpm run serve:docker
      nx run server:prisma-migrate