A professional stock analysis platform with ML-powered insights, beautiful visualizations, and comprehensive trading tools.
- Real-time Stock Analysis - Live data with technical indicators
- Modern UI - Beautiful, professional Streamlit dashboard
- Smart Alerts - Automatic detection of trading signals
- Backtesting - Test strategies on historical data
- Risk Management - VaR, CVaR, drawdown analysis
- Data Export - CSV, JSON, Excel, Parquet formats
- Interactive Charts - Candlestick, indicators, volume analysis
```bash
# Clone the repository
git clone <repo-url> apex-analysis
cd apex-analysis

# Install dependencies
pip install -r requirements.txt

# Launch the Streamlit dashboard
streamlit run streamlit_app.py
```

Or simply double-click run_app.bat on Windows!

The dashboard will open at http://localhost:8501
```bash
# Run the interactive CLI
python main.py
```

Type a ticker symbol (e.g., AAPL, GOOGL) and press Enter.
The modern web dashboard (streamlit_app.py) provides a professional, interactive interface:
- Watchlist Overview: Real-time metrics for your favorite stocks
- Quick Stats: Price, change %, and trend indicators
- Analysis History: Track your recent analyses
- Beautiful Charts: Interactive candlestick and technical indicator charts
- Smart Alerts: Automatic detection of:
  - Moving average crossovers (bullish/bearish)
  - RSI overbought/oversold conditions
  - Volume spikes
  - 52-week high/low proximity
  - MACD crossovers
  - Significant price movements
- Technical Indicators: RSI, MACD, MA20, MA50, Bollinger Bands
- Key Metrics: Live price, volume, 52-week ranges
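As a rough illustration, the indicators listed above can be computed from a price series with pandas alone. This is a generic sketch using standard parameters (14-period RSI, 12/26/9 MACD, 20-period Bollinger Bands), not necessarily the exact settings the dashboard uses:

```python
import pandas as pd

def add_indicators(df: pd.DataFrame) -> pd.DataFrame:
    """Add common technical indicators to a DataFrame with a 'Close' column."""
    out = df.copy()
    out["MA20"] = out["Close"].rolling(20).mean()
    out["MA50"] = out["Close"].rolling(50).mean()
    # RSI with Wilder-style smoothing via an exponential moving average
    delta = out["Close"].diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / 14, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / 14, adjust=False).mean()
    out["RSI"] = 100 - 100 / (1 + gain / loss)
    # MACD: difference of 12- and 26-period EMAs, plus a 9-period signal line
    ema12 = out["Close"].ewm(span=12, adjust=False).mean()
    ema26 = out["Close"].ewm(span=26, adjust=False).mean()
    out["MACD"] = ema12 - ema26
    out["MACD_signal"] = out["MACD"].ewm(span=9, adjust=False).mean()
    # Bollinger Bands: 20-period mean plus/minus two standard deviations
    std20 = out["Close"].rolling(20).std()
    out["BB_upper"] = out["MA20"] + 2 * std20
    out["BB_lower"] = out["MA20"] - 2 * std20
    return out
```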
- Strategy Testing: MA Crossover, RSI Mean Reversion, MACD Momentum
- Performance Metrics:
- Total & Annualized Returns
- Sharpe & Sortino Ratios
- Maximum Drawdown
- Win Rate & Total Trades
- Equity Curve: Visual representation of portfolio value over time
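For reference, the performance metrics above can all be derived from a daily equity curve along these lines. This is a sketch assuming daily data and a zero risk-free rate; `performance_metrics` is an illustrative name, not a function in this repo:

```python
import numpy as np

def performance_metrics(equity, periods_per_year: int = 252) -> dict:
    """Summarize a daily equity curve (portfolio value over time).

    Assumes a zero risk-free rate; Sortino uses downside deviation only.
    """
    equity = np.asarray(equity, dtype=float)
    returns = np.diff(equity) / equity[:-1]          # simple daily returns
    total_return = equity[-1] / equity[0] - 1
    years = len(returns) / periods_per_year
    annualized = (1 + total_return) ** (1 / years) - 1
    sharpe = np.sqrt(periods_per_year) * returns.mean() / returns.std(ddof=1)
    downside = returns[returns < 0]                  # losing days only
    sortino = np.sqrt(periods_per_year) * returns.mean() / downside.std(ddof=1)
    running_max = np.maximum.accumulate(equity)      # peak value so far
    max_drawdown = ((equity - running_max) / running_max).min()
    return {
        "total_return": total_return,
        "annualized_return": annualized,
        "sharpe": sharpe,
        "sortino": sortino,
        "max_drawdown": max_drawdown,
    }
```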
- Value at Risk (VaR): 95% and 99% confidence levels
- Conditional VaR (CVaR): Expected shortfall analysis
- Drawdown Analysis: Visual drawdown chart
- Returns Distribution: Histogram of daily returns
- Risk-Adjusted Returns: Sharpe and Sortino ratios
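Historical VaR and CVaR as used above can be estimated directly from the empirical return distribution; a minimal sketch (`var_cvar` is an illustrative helper, not part of the codebase):

```python
import numpy as np

def var_cvar(returns, confidence: float = 0.95):
    """Historical Value at Risk and Conditional VaR (expected shortfall).

    VaR is the loss exceeded with probability (1 - confidence); CVaR is the
    average loss in that tail, so CVaR >= VaR by construction.
    """
    losses = -np.asarray(returns)                  # flip sign: positive = loss
    var = float(np.quantile(losses, confidence))   # e.g. 95th-percentile loss
    cvar = float(losses[losses >= var].mean())     # mean loss beyond the VaR
    return var, cvar
```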
- Automatic Alerts: Generated during stock analysis
- Custom Alerts: Set your own price, change, or volume thresholds
- Alert Management: View and remove active alerts
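A minimal sketch of how custom threshold alerts might be evaluated; the `check_alerts` function and the threshold key names are illustrative, not the app's actual API:

```python
def check_alerts(ticker: str, price: float, change_pct: float,
                 volume: int, thresholds: dict) -> list[str]:
    """Compare current metrics against user thresholds; return triggered alerts.

    Missing thresholds default to infinity, i.e. they never trigger.
    """
    alerts = []
    if price >= thresholds.get("price_above", float("inf")):
        alerts.append(f"{ticker}: price {price:.2f} is above target")
    if abs(change_pct) >= thresholds.get("abs_change_pct", float("inf")):
        alerts.append(f"{ticker}: moved {change_pct:+.1f}% today")
    if volume >= thresholds.get("volume_above", float("inf")):
        alerts.append(f"{ticker}: volume spike ({volume:,})")
    return alerts
```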
- Multiple Formats: CSV, JSON, Excel (.xlsx), Parquet
- Flexible Options: Include/exclude technical indicators
- Data Preview: See your data before downloading
- One-Click Download: Export with timestamp and ticker
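Under the hood, the export options above amount to picking a pandas writer and a timestamped filename. A sketch, assuming `export_data` as an illustrative helper (the Excel and Parquet writers additionally need `openpyxl` and `pyarrow` installed):

```python
from datetime import datetime
from pathlib import Path
import pandas as pd

def export_data(df: pd.DataFrame, ticker: str,
                fmt: str = "csv", out_dir: str = "exports") -> Path:
    """Write df to out_dir/<TICKER>_<timestamp>.<fmt> and return the path."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    path = Path(out_dir)
    path.mkdir(parents=True, exist_ok=True)
    dest = path / f"{ticker}_{stamp}.{fmt}"
    writers = {
        "csv": lambda p: df.to_csv(p, index=False),
        "json": lambda p: df.to_json(p, orient="records", date_format="iso"),
        "xlsx": lambda p: df.to_excel(p, index=False),  # requires openpyxl
        "parquet": lambda p: df.to_parquet(p),          # requires pyarrow
    }
    writers[fmt](dest)
    return dest
```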
See DEPLOYMENT.md for detailed instructions on:
- Deploying to Streamlit Cloud (FREE)
- Running locally
- Configuration options
- Troubleshooting
Reports are saved under the repository root `reports/<TICKER>/`. The following file types are produced:
- CSV: price and sentiment data (e.g. `NVDA_price_data_YYYYMMDD_HHMMSS.csv`)
- JSON: structured exports and summaries (e.g. `NVDA_summary_...json`)
- PNG: visualizations (e.g. `NVDA_NVDA_analysis_...png`, `NVDA_NVDA_sentiment_...png`)

Example path inside this repo: `reports/NVDA/NVDA_price_data_2025...csv`.
Below is the complete project structure (top-level and src/):
```text
/ (repo root)
├── LICENSE
├── README.md
├── main.py
├── pyproject.toml
├── requirements.txt
├── setup.py
├── __init__.py
├── apex_analysis.egg-info/
├── reports/              # generated output (created at runtime)
├── cache/                # runtime cache (may be empty)
├── test_aggregate.py     # small helper test (dev)
├── test_save_png.py      # small helper test (dev)
└── src/
    ├── __init__.py
    ├── __main__.py
    ├── aggregator.py          # orchestrates fetching, analysis & report saving
    ├── config.py              # central configuration (paths, plot settings)
    ├── fetch_data.py          # price/history fetching (yfinance)
    ├── news_processor.py      # RSS fetcher & article scraper
    ├── sentiment_analyzer.py  # sentiment scoring, VADER/TextBlob
    ├── ui.py                  # command-line interface + plot generation
    ├── utils.py               # helper functions (save_plot, save_dataframe, logging)
    └── reports/               # legacy folder inside src (not used; canonical is repo-root/reports)
```
Files of interest and where to find them:

- `src/config.py` – central location for `REPORTS_DIR`, `PLOT_DPI`, `SAVE_PLOTS`, and other settings.
- `src/aggregator.py` – main logic that fetches price and news, runs sentiment analysis, and writes JSON/CSV results. It now attaches pandas DataFrames (`price_history`, `sentiment_data`) to the returned result so `ui.generate_report()` can create PNGs.
- `src/ui.py` – builds matplotlib figures and saves PNGs to `reports/<TICKER>/` (calls internal helpers but could be refactored to use `utils.save_plot()` centrally).
- `src/utils.py` – central helper functions for file I/O and logging; uses `REPORTS_DIR` from `src/config.py` to ensure all files are written to the same place.
- `src/fetch_data.py` – handles yfinance calls and prepares the `history` DataFrame used for plotting.
- `src/news_processor.py` – fetches RSS feeds and scrapes article content when allowed (respects robots.txt by default).
- `src/sentiment_analyzer.py` – analyzes article text using NLTK VADER and TextBlob and returns scores used in plots and reports.
- User enters a ticker in the CLI (`ui.run_cli`).
- `aggregator.aggregate_analysis()` fetches price data via `fetch_data`, fetches news via `news_processor`, then runs `sentiment_analyzer.batch_analyze()`.
- The aggregator saves CSV/JSON outputs to `reports/<TICKER>/` and attaches DataFrames to its return value.
- `ui.generate_report()` builds plots from `price_history` and `sentiment_data` (if present) and saves PNGs to the same directory.
Open `src/config.py` to change behavior. Important options:

- `REPORTS_DIR` – path where reports are written (default: repo root `reports/`).
- `PLOT_DPI`, `PLOT_FIGSIZE`, `PLOT_STYLE` – Matplotlib settings for saved figures.
- `SAVE_PLOTS` – whether to save PNGs (if False, PNG saving is skipped where respected).
- `RESPECT_ROBOTS` – respect `robots.txt` when scraping articles (recommended: True).

To change the reports path, edit `REPORTS_DIR` in `src/config.py`. The code will create the directory automatically.
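For orientation, `src/config.py` likely has roughly this shape. This is a hypothetical sketch of the settings named above; consult the actual file for the real values and defaults:

```python
# Hypothetical sketch of src/config.py; the real file may differ.
from pathlib import Path

REPO_ROOT = Path(__file__).resolve().parent.parent

REPORTS_DIR = REPO_ROOT / "reports"   # all CSV/JSON/PNG output lands here
SAVE_PLOTS = True                     # set False to skip PNG generation

PLOT_DPI = 150                        # resolution of saved figures
PLOT_FIGSIZE = (12, 6)                # matplotlib figure size in inches
PLOT_STYLE = "default"                # any registered matplotlib style

RESPECT_ROBOTS = True                 # honor robots.txt when scraping articles
```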
- The REST API generates local users (admin/demo) on first run and stores them under `data/users.json`.
- The entire `data/` directory is ignored by git, so API keys and refresh tokens never leave your machine.
- If you need to reset credentials, delete `data/users.json` and restart the web app to recreate fresh keys.
Generate a single analysis run for NVDA and exit:

```bash
printf "NVDA\nquit\n" | python3 main.py
```

Run the aggregator directly from a script (useful for automation):

```python
from src.aggregator import aggregate_analysis

res = aggregate_analysis('NVDA')
print(res['saved_files'])
```

There are two small helper scripts used during development:

- `test_save_png.py` – small script that uses `src.utils.save_plot()` to verify PNG saving.
- `test_aggregate.py` – runs `aggregate_analysis('TEST')` to exercise the main flow.
These are convenience scripts rather than full unit tests; converting them to pytest tests and adding a CI workflow would be a natural next step.
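A pytest version of the PNG helper could look like the following. This sketch is self-contained (it writes a figure with matplotlib directly); a real test would exercise `src.utils.save_plot()` instead, and the file name `tests/test_save_png.py` is only a suggestion:

```python
# Sketch of tests/test_save_png.py (hypothetical location).
import matplotlib
matplotlib.use("Agg")  # headless backend so the test runs in CI
import matplotlib.pyplot as plt

def test_png_is_written(tmp_path):
    """Smoke test: a figure saved to disk should exist and be non-empty."""
    fig, ax = plt.subplots()
    ax.plot([1, 2, 3], [1, 4, 9])
    out = tmp_path / "TEST_analysis.png"
    fig.savefig(out, dpi=100)
    plt.close(fig)
    assert out.exists() and out.stat().st_size > 0
```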
- If nothing appears in `reports/<TICKER>/`:
  - Verify `src/config.py:REPORTS_DIR` points to the expected path.
  - Check log messages (stdout/stderr) for exceptions printed by `src.utils.logger`.
  - Ensure dependencies are installed (`pip install -r requirements.txt`).
- If PNGs are not produced but CSV/JSON are:
  - Make sure `aggregate_analysis` returned `price_history` and/or `sentiment_data` DataFrames (these are required for plotting).
  - Check `src/ui.py` to confirm `generate_report()` is being called.
- Fork the repo and create a branch for your changes.
- Run and verify the app locally.
- Open a PR with a clear description and any relevant screenshots or logs.
Useful future additions include:
- a pytest-based smoke test that asserts PNG files are created for a known ticker,
- a GitHub Actions workflow to run basic tests on push,
- unifying plot saving to use `src.utils.save_plot()` only (less duplication).
This project is distributed under the MIT License; see LICENSE for details.
If you have questions or want help extending the project, open an issue or PR in the repository.
This project was originally designed, completed, and tested by Richard Zhu and was later co-edited by Ishaan Manoor, Tanish Patel, and Dennis Talpa.
This repository is maintained under the MeridianAlgo initiative. MeridianAlgo focuses on algorithmic research and applied data science in finance and related fields. For inquiries, collaborations, or support, contact:
- Email: meridianaglo@gmail.com
- Website: https://meridianalgo.org
MeridianAlgo provides educational resources and prototypes; this project is provided for educational and research use.
- See `CONTRIBUTORS.md` for a full list of contributors, contribution guidelines, and acknowledgements.
- See `SECURITY.md` for the project's security and vulnerability disclosure policy.

Both files are included at the repository root and linked here for quick access.
Below are direct links to the key source files used by this project. Click the filename to jump to the file in the repository viewer.
- main.py – application entry point
- src/config.py – central config and paths
- src/aggregator.py – coordinates data fetching, analysis, and saving
- src/fetch_data.py – price/history fetching (yfinance)
- src/news_processor.py – RSS and article scraping
- src/sentiment_analyzer.py – sentiment scoring logic
- src/ui.py – CLI UI and plotting
- src/utils.py – helper functions for saving/loading and logging
- CONTRIBUTORS.md – contributor list & contribution guidelines
- SECURITY.md – vulnerability disclosure policy