Incident Summary Parser Package

This Python package provides a simple and reliable way to analyze and extract structured incident summaries from unstructured text reports or news snippets. It leverages large language models (LLMs) to interpret incident descriptions and outputs standardized information, making it easy to integrate incident analysis into your workflow or applications.

Installation

Install the package via pip:

pip install incident_summary_parser

Usage

Import and use the incident_summary_parser function as follows:

from incident_summary_parser import incident_summary_parser

response = incident_summary_parser(
    user_input="Your incident report text here",
    api_key="your_llm7_api_key",  # optional if set via environment variable
    llm=None  # optional, can pass your own LLM instance
)
print(response)

Parameters

  • user_input (str): The incident report or news snippet you want to analyze.

  • llm (Optional[BaseChatModel]): An optional language model instance conforming to LangChain's BaseChatModel interface. If not provided, the function defaults to ChatLLM7.

  • api_key (Optional[str]): Your API key for the LLM7 service. Can also be set via the environment variable LLM7_API_KEY.
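
The precedence between the explicit api_key argument and the LLM7_API_KEY environment variable can be sketched as follows. This is a minimal illustration of the documented fallback behavior; resolve_api_key is a hypothetical helper for demonstration, not part of the package's public API:

```python
import os

def resolve_api_key(api_key=None):
    # Hypothetical helper mirroring the documented behavior: an explicit
    # api_key argument wins; otherwise fall back to the LLM7_API_KEY
    # environment variable (None if neither is set).
    return api_key if api_key is not None else os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "key-from-env"
print(resolve_api_key())            # environment fallback: key-from-env
print(resolve_api_key("explicit"))  # explicit argument wins: explicit
```

Setting LLM7_API_KEY once in your environment lets you omit api_key from every call.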

Supporting Custom LLMs

For flexibility, you can pass your own LLM implementation, such as OpenAI, Anthropic, or Google's Generative AI models. For example:

from langchain_openai import ChatOpenAI
from incident_summary_parser import incident_summary_parser

llm = ChatOpenAI()
response = incident_summary_parser(
    user_input="Sample incident report",
    llm=llm
)

Or with other providers:

from langchain_anthropic import ChatAnthropic
from incident_summary_parser import incident_summary_parser

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # ChatAnthropic requires a model name
response = incident_summary_parser(
    user_input="Sample incident report",
    llm=llm
)

Notes

  • The package uses ChatLLM7 from langchain_llm7 (see PyPI) by default.
  • Default rate limits are suitable for most use cases, but you can increase limits by obtaining your own API key.
  • You can register for a free API key at https://token.llm7.io/.

Contributing

Please report issues or contribute improvements via the GitHub repository: https://github.....

Author