# feedback_analyzer_mod

A Python package designed to analyze and structure user-submitted text, specifically focusing on community feedback and moderation. This tool leverages the capabilities of `llmatch-messages` to process and extract meaningful insights from user inputs, such as forum posts, comments, or feedback forms. By using pattern matching and retry logic, the package ensures that the extracted data is consistent and formatted correctly, making it easier for moderators to review and respond to user feedback.
## Features

- Pattern Matching: Extracts structured data from unstructured user inputs.
- Retry Logic: Ensures consistent and reliable data extraction.
- Flexible LLM Integration: Supports various LLM providers, including LLM7, OpenAI, Anthropic, and Google.
- Easy Integration: Simple API for seamless integration into existing workflows.
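The pattern-matching-with-retry idea behind the package can be sketched roughly as follows. This is an illustrative toy, not the package's actual implementation: the `extract_with_retry` helper, the `<feedback>` tag pattern, and the `fake_llm` stand-in are all assumptions made for the example.

```python
import re

def extract_with_retry(call_llm, user_input,
                       pattern=r"<feedback>(.*?)</feedback>", max_retries=3):
    """Call the LLM until its reply matches the expected pattern."""
    for attempt in range(max_retries):
        reply = call_llm(user_input)
        match = re.search(pattern, reply, re.DOTALL)
        if match:
            # Return only the structured payload, not the raw reply.
            return match.group(1).strip()
    raise ValueError("LLM never produced a parsable reply")

# Stand-in for a real chat model (illustrative only).
def fake_llm(text):
    return f"<feedback>{text.upper()}</feedback>"

print(extract_with_retry(fake_llm, "great forum post"))
# prints: GREAT FORUM POST
```

The retry loop is what makes extraction reliable: if a model reply does not match the expected pattern, the call is simply repeated instead of passing malformed output downstream.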
## Installation

```bash
pip install feedback_analyzer_mod
```

## Usage

### Basic Usage

```python
from feedback_analyzer_mod import feedback_analyzer_mod

user_input = "Your user input text here"
response = feedback_analyzer_mod(user_input)
print(response)
```

### Using OpenAI

```python
from langchain_openai import ChatOpenAI
from feedback_analyzer_mod import feedback_analyzer_mod

llm = ChatOpenAI()
user_input = "Your user input text here"
response = feedback_analyzer_mod(user_input, llm=llm)
print(response)
```

### Using Anthropic

```python
from langchain_anthropic import ChatAnthropic
from feedback_analyzer_mod import feedback_analyzer_mod

llm = ChatAnthropic()
user_input = "Your user input text here"
response = feedback_analyzer_mod(user_input, llm=llm)
print(response)
```

### Using Google

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from feedback_analyzer_mod import feedback_analyzer_mod

llm = ChatGoogleGenerativeAI()
user_input = "Your user input text here"
response = feedback_analyzer_mod(user_input, llm=llm)
print(response)
```

## Parameters

- `user_input` (str): The user input text to process.
- `llm` (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key` (Optional[str]): The API key for LLM7. If not provided, the environment variable `LLM7_API_KEY` will be used.
## API Key

The package uses `ChatLLM7` from `langchain_llm7` by default. You can get a free API key by registering at LLM7. The LLM7 free-tier rate limits are sufficient for most use cases of this package; if you need higher limits, pass your own API key via the `LLM7_API_KEY` environment variable or directly via the `api_key` parameter.
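The key-resolution order described above (an explicit `api_key` argument wins, otherwise the `LLM7_API_KEY` environment variable is consulted) can be sketched as follows; the `resolve_llm7_key` helper is illustrative and not part of the package's public API.

```python
import os

def resolve_llm7_key(api_key=None):
    """Return the explicit key if given, else fall back to the environment."""
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "env-key"
print(resolve_llm7_key())            # prints: env-key
print(resolve_llm7_key("explicit"))  # prints: explicit
```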
## Author

- Eugene Evstafev
- Email: hi@eugene.plus
- GitHub: chigwell

For any issues or suggestions, please open an issue on GitHub.