# vpnfeedbacker

Simplify and structure user feedback about VPN sign-up experiences with this package.
VPNFeedbacker takes text input from users describing their sign-up process with a VPN service and returns a structured output categorizing the feedback into key aspects like simplicity, speed, and user satisfaction.
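As a rough illustration of what "structured output" means here, the sketch below models the three aspects mentioned above as a small dataclass. The class name and fields are hypothetical, chosen for this example only; the package's actual schema may differ.

```python
from dataclasses import dataclass

@dataclass
class SignupFeedback:
    # Hypothetical fields illustrating the kind of structure returned;
    # the real schema is defined by the package itself.
    simplicity: str     # e.g. "easy" or "confusing"
    speed: str          # e.g. "fast" or "slow"
    satisfaction: str   # overall sentiment, e.g. "positive"

feedback = SignupFeedback(simplicity="easy", speed="fast", satisfaction="positive")
print(feedback.satisfaction)  # positive
```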
## Installation

```bash
pip install vpnfeedbacker
```

## Usage

```python
from vpnfeedbacker import vpnfeedbacker

response = vpnfeedbacker(user_input)
```

Or with a custom LLM instance (based on langchain):
```python
from langchain_openai import ChatOpenAI
from vpnfeedbacker import vpnfeedbacker

llm = ChatOpenAI()
response = vpnfeedbacker(user_input, llm=llm)
```

### Parameters

- `user_input`: `str`, the user input text to process.
- `llm`: `Optional[BaseChatModel]`, the langchain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key`: `Optional[str]`, the API key for LLM7. If not provided, it defaults to `None` or the value of the `LLM7_API_KEY` environment variable.
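The `api_key` fallback described above can be sketched as follows. `resolve_api_key` is a hypothetical helper written for this illustration, not part of the package's API:

```python
import os

def resolve_api_key(api_key=None):
    # An explicit argument wins; otherwise fall back to the
    # LLM7_API_KEY environment variable; otherwise None.
    if api_key is not None:
        return api_key
    return os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "env-key"
print(resolve_api_key("explicit-key"))  # explicit-key (argument takes precedence)
print(resolve_api_key())                # env-key (environment fallback)
```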
## Supported LLMs

By default, the package uses the `ChatLLM7` instance from `langchain_llm7`.
You can use other LLMs from langchain by passing your own LLM instance, e.g. OpenAI, Anthropic, or Google Generative AI:
```python
from langchain_openai import ChatOpenAI
from vpnfeedbacker import vpnfeedbacker

llm = ChatOpenAI()
response = vpnfeedbacker(user_input, llm=llm)
```

or

```python
from langchain_anthropic import ChatAnthropic
from vpnfeedbacker import vpnfeedbacker

llm = ChatAnthropic()
response = vpnfeedbacker(user_input, llm=llm)
```

or

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from vpnfeedbacker import vpnfeedbacker

llm = ChatGoogleGenerativeAI()
response = vpnfeedbacker(user_input, llm=llm)
```

## Rate limits

The default rate limits for the LLM7 free tier are sufficient for most use cases of this package. If you need higher rate limits, pass your own API key via the `LLM7_API_KEY` environment variable or directly, e.g. `vpnfeedbacker(user_input, api_key="your_api_key")`. Get a free API key at https://token.llm7.io/.
## Contributing

Open issues and pull requests are welcome at https://github.com/chigwell/vpnfeedbacker.
## Author

Eugene Evstafev <hi@eugene.plus>