Extract structured, concise summaries from news headlines and brief texts using pattern matching and LLM interactions.
This package helps researchers, journalists, and analysts quickly categorize and understand complex domain-specific issues (e.g., environmental crises, economic policies, geopolitics) by summarizing raw textual snippets. It does not process full documents, multimedia, or URLs; it focuses on rapid, structured insights for decision-making or reporting.
- Pattern-based extraction for structured summaries
- LLM-powered summarization with configurable models
- Lightweight – works with short text snippets
- Flexible LLM integration (supports OpenAI, Anthropic, Google, etc.)
- Default LLM7 integration (free tier sufficient for most use cases)
Installation:

```bash
pip install text_snippet_summarizer
```

Quick start:

```python
from text_snippet_summarizer import text_snippet_summarizer

user_input = "Climate change impacts: rising temperatures, extreme weather events, and biodiversity loss."
response = text_snippet_summarizer(user_input)
print(response)
```
Using OpenAI:

```python
from langchain_openai import ChatOpenAI
from text_snippet_summarizer import text_snippet_summarizer

llm = ChatOpenAI()
response = text_snippet_summarizer(user_input, llm=llm)
```
Using Anthropic:

```python
from langchain_anthropic import ChatAnthropic
from text_snippet_summarizer import text_snippet_summarizer

llm = ChatAnthropic()
response = text_snippet_summarizer(user_input, llm=llm)
```
Using Google Gemini:

```python
from langchain_google_genai import ChatGoogleGenerativeAI
from text_snippet_summarizer import text_snippet_summarizer

llm = ChatGoogleGenerativeAI()
response = text_snippet_summarizer(user_input, llm=llm)
```
API key configuration:

- Default: Uses `LLM7_API_KEY` from environment variables.
- Manual override: Pass `api_key` directly:

  ```python
  response = text_snippet_summarizer(user_input, api_key="your_api_key_here")
  ```

- Get a free LLM7 API key: Register here
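A minimal sketch of supplying the key via the environment variable instead of the `api_key` argument (assuming, as described above, that the package falls back to `LLM7_API_KEY` when no key is passed; the key value and snippet are placeholders):

```python
import os

from text_snippet_summarizer import text_snippet_summarizer

# The package is documented to read LLM7_API_KEY from the environment
# when no api_key argument is supplied.
os.environ["LLM7_API_KEY"] = "your_api_key_here"

response = text_snippet_summarizer(
    "Central bank raises rates by 50 basis points amid inflation concerns."
)
print(response)
```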
Function signature:

```python
text_snippet_summarizer(
    user_input: str,
    api_key: Optional[str] = None,
    llm: Optional[BaseChatModel] = None
) -> List[str]
```

- `user_input`: Raw text snippet to summarize.
- `api_key` (optional): LLM7 API key (defaults to the `LLM7_API_KEY` env var).
- `llm` (optional): Custom LangChain LLM (e.g., `ChatOpenAI`, `ChatAnthropic`).
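Because `llm` accepts any LangChain `BaseChatModel`, a configured chat model can be passed directly; the sketch below is illustrative, and the model name is an assumption rather than a package requirement:

```python
from typing import List

from langchain_openai import ChatOpenAI
from text_snippet_summarizer import text_snippet_summarizer

# Any BaseChatModel works here; the model name is just an example choice.
llm = ChatOpenAI(model="gpt-4o-mini")

summary_points: List[str] = text_snippet_summarizer(
    "Port congestion eases as global shipping volumes normalize.",
    llm=llm,
)
```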
Notes:

- Rate limits: The LLM7 free tier is sufficient for most use cases.
- Output: Returns a list of structured summary points matching predefined patterns.
- Dependencies: Uses `langchain_llm7` (default) or any `BaseChatModel` from LangChain.
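Since the return value is a list of summary-point strings, a typical way to consume it is to iterate over the points. This sketch assumes only the documented `List[str]` return type; the exact wording of the points depends on the predefined patterns and the LLM used:

```python
from text_snippet_summarizer import text_snippet_summarizer

points = text_snippet_summarizer(
    "Drought worsens across the region; crop yields fall and water restrictions expand."
)

# Print the structured summary points as a numbered list.
for i, point in enumerate(points, start=1):
    print(f"{i}. {point}")
```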
Report bugs or feature requests: 🔗 GitHub Issues
Eugene Evstafev (@chigwell) 📧 hi@euegne.plus