🤖 LangChain4j ChatModel: Deep Dive into AI Interactions

This repository demonstrates the full capabilities of the ChatModel interface in LangChain4j. It explores the core mechanics of AI interaction, showing how to construct complex requests, manage conversation context, configure model parameters, and analyze rich response metadata.

📖 Complete Guide: For detailed explanations and a full code walkthrough, read our comprehensive tutorial.
👉 LangChain4j ChatModel: A Complete Beginner’s Guide

🎥 Video Tutorial: Prefer hands-on learning? Watch our step-by-step implementation guide.
👉 YouTube Tutorial - LangChain4j ChatModel: The Complete Guide to Requests, Responses & Parameters


โ–ถ๏ธ Watch on YouTube


✨ What This Project Demonstrates

This application serves as a deep dive into the ChatModel API, covering the full lifecycle of an AI request (see the code sketch after the list below):

  • Message Management - Understanding the roles of SystemMessage, UserMessage, and AiMessage to create context-aware personas.
  • Request Configuration - Using ChatRequest and ChatRequestParameters to configure model behavior (Temperature, Max Tokens, Stop Sequences).
  • Contextual Conversations - Managing conversation history to enable back-and-forth dialogue logic.
  • Response Analysis - Extracting critical metadata from ChatResponse, including TokenUsage and FinishReason.
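
The sketch below ties these pieces together with the LangChain4j 1.x request/response API. It is a minimal, illustrative example rather than code from this repository: the ChatModel instance is assumed to be configured elsewhere (for example as a Spring bean), the persona and prompts are made up, and the exact builder entry points (e.g. ChatRequestParameters.builder()) can vary slightly between LangChain4j versions.

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.request.ChatRequestParameters;
import dev.langchain4j.model.chat.response.ChatResponse;

import java.util.List;

public class ChatModelWalkthrough {

    private final ChatModel chatModel; // assumed to be configured elsewhere (e.g. a Spring bean)

    public ChatModelWalkthrough(ChatModel chatModel) {
        this.chatModel = chatModel;
    }

    public void run() {
        // Message management: a system persona plus a user question
        SystemMessage persona = SystemMessage.from("You are a concise travel assistant.");
        UserMessage question = UserMessage.from("Suggest one city to visit in Japan.");

        // Request configuration: temperature, max tokens, stop sequences
        ChatRequestParameters parameters = ChatRequestParameters.builder()
                .temperature(0.7)
                .maxOutputTokens(200)
                .stopSequences(List.of("END"))
                .build();

        ChatRequest request = ChatRequest.builder()
                .messages(persona, question)
                .parameters(parameters)
                .build();

        // Response analysis: the AI message plus token usage and finish reason
        ChatResponse response = chatModel.chat(request);
        AiMessage answer = response.aiMessage();
        System.out.println("Answer: " + answer.text());
        System.out.println("Total tokens: " + response.tokenUsage().totalTokenCount());
        System.out.println("Finish reason: " + response.finishReason());

        // Contextual conversation: replay the history so the follow-up has context
        ChatRequest followUp = ChatRequest.builder()
                .messages(persona, question, answer, UserMessage.from("Why that city?"))
                .parameters(parameters)
                .build();
        System.out.println(chatModel.chat(followUp).aiMessage().text());
    }
}

Replaying the accumulated messages in the follow-up request is what gives the model its conversational context; the model itself is stateless across calls.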

๐Ÿ› ๏ธ Prerequisites

To run this application, you will need the following:

  1. OpenRouter API Key: This project uses OpenRouter to access free AI models (DeepSeek, Llama, etc.) via OpenAI-compatible endpoints.
  2. Set Up the Environment Variable: Export your API key as an environment variable (see the configuration sketch below):
# Set your OpenRouter API Key
export OPENROUTER_API_KEY=your_api_key_here
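
As a rough illustration, a ChatModel bean pointed at OpenRouter could be wired up as follows. This is a hedged sketch under assumptions, not necessarily how this repository configures the model (the actual project may rely on the LangChain4j Spring Boot starter and application properties instead), and the model id shown is just one example of a model available on OpenRouter.

import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ChatModelConfig {

    @Bean
    public ChatModel chatModel() {
        // OpenRouter exposes an OpenAI-compatible API, so the OpenAI model builder
        // is simply pointed at the OpenRouter base URL.
        return OpenAiChatModel.builder()
                .baseUrl("https://openrouter.ai/api/v1")
                .apiKey(System.getenv("OPENROUTER_API_KEY")) // the variable exported above
                .modelName("deepseek/deepseek-chat")         // example model id; any OpenRouter model works
                .build();
    }
}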

🚀 How to Run and Test

For detailed instructions on how to set up, configure, and test the application, please refer to our comprehensive article:
👉 Click here for Setup & Testing Instructions