A new package that analyzes user-provided text summaries of yearly breakdowns (e.g., financial reports, project post-mortems, or performance reviews) and extracts structured insights. It uses an LLM to identify key themes, recurring issues, successes, and recommendations, then formats the output into a consistent, machine-readable structure. This helps users quickly digest and act on summarized yearly data without manual parsing.
Install from PyPI:

```bash
pip install yearly_insights_parser
```

```python
from yearly_insights_parser import yearly_insights_parser

response = yearly_insights_parser(
    user_input="...",           # user-provided text input
    api_key="your_api_key",     # optional, use an LLM7 API key for higher rate limits
    llm=your_llm_instance,      # optional, use a custom LLM instance
)
```

The `yearly_insights_parser` function takes three parameters:

- `user_input`: the text input to process (string)
- `api_key`: optional, an LLM7 API key for higher rate limits (string)
- `llm`: optional, a custom LLM instance (a `BaseChatModel` instance)
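For instance, a minimal end-to-end call might look like the sketch below. The sample summary text is illustrative, and the exact structure of the returned insights depends on the underlying LLM, so the result is simply printed here.

```python
from yearly_insights_parser import yearly_insights_parser

# Hypothetical yearly summary text; any plain-text breakdown works as input.
summary = (
    "2023 revenue grew 12% year over year, driven mainly by the new subscription tier. "
    "Support ticket volume spiked in Q3 after the billing migration, and onboarding "
    "time remains the most cited issue in customer feedback. Recommended focus for "
    "2024: automate onboarding and stabilize the billing pipeline."
)

response = yearly_insights_parser(user_input=summary)

# The response holds the structured insights extracted by the LLM
# (themes, recurring issues, successes, recommendations).
print(response)
```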
By default, it uses `ChatLLM7` from [langchain_llm7](https://pypi.org/project/langchain-llm7/). If you want to use another LLM, you can pass your own instance:
```python
from langchain_openai import ChatOpenAI
from yearly_insights_parser import yearly_insights_parser

llm = ChatOpenAI()
response = yearly_insights_parser(user_input="...", llm=llm)
```

Similarly, you can use `ChatAnthropic` or `ChatGoogleGenerativeAI` from [langchain_anthropic](https://pypi.org/project/langchain-anthropic/) or [langchain_google_genai](https://pypi.org/project/langchain-google-genai/) respectively.
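For example, a sketch using `ChatAnthropic` (assuming `langchain_anthropic` is installed and `ANTHROPIC_API_KEY` is set; the model name is only an example):

```python
from langchain_anthropic import ChatAnthropic
from yearly_insights_parser import yearly_insights_parser

# Example model name; substitute whichever Anthropic model you have access to.
llm = ChatAnthropic(model="claude-3-haiku-20240307")
response = yearly_insights_parser(user_input="...", llm=llm)
```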
The default rate limits for the LLM7 free tier are generally sufficient for most use cases. If you require higher rate limits for LLM7, you can provide your own API key:
- Via the `LLM7_API_KEY` environment variable (see the sketch after this list).
- Directly, by passing the `api_key` argument to the function:

  ```python
  yearly_insights_parser(user_input="...", api_key="your_api_key")
  ```

You can obtain a free API key by registering at https://token.llm7.io/.
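As a sketch, the environment-variable route can also be set from Python before the call, assuming the package reads `LLM7_API_KEY` when `yearly_insights_parser` is invoked:

```python
import os
from yearly_insights_parser import yearly_insights_parser

# Equivalent to exporting LLM7_API_KEY in your shell; assumes the variable
# is read at call time rather than at import time.
os.environ["LLM7_API_KEY"] = "your_api_key"

response = yearly_insights_parser(user_input="...")
```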
Please report any issues or suggestions on the GitHub issues page: https://github.com/chigwell/yearly-insights-parser/
- Eugene Evstafev (hi@eugene.plus)
MIT License - see the LICENSE.md file for details.