Extract key insights, definitions, and actionable takeaways from any text input using a large language model.
```
pip install cogni_summary
```

```python
from cogni_summary import cogni_summary

response = cogni_summary(user_input="Your text here")
```

You can pass additional parameters to customize the model:
- `user_input`: The text to be summarized (required).
- `llm`: The LangChain LLM instance to use. If not provided, the default `ChatLLM7` will be used.
- `api_key`: The API key for LLM7. If not provided, the default rate limits for the LLM7 free tier will apply.
You can safely pass your own LLM instance (e.g., OpenAI, Anthropic, Google) using the corresponding LangChain integration library. For example, to use the OpenAI LLM:
```python
from langchain_openai import ChatOpenAI
from cogni_summary import cogni_summary

llm = ChatOpenAI()  # reads OPENAI_API_KEY from the environment
user_input = "Your text here"
response = cogni_summary(user_input, llm=llm)
```

Alternatively, to use the Anthropic LLM:
```python
from langchain_anthropic import ChatAnthropic
from cogni_summary import cogni_summary

# ChatAnthropic requires a model name; substitute any Claude model you have access to
llm = ChatAnthropic(model="claude-3-5-sonnet-latest")
user_input = "Your text here"
response = cogni_summary(user_input, llm=llm)
```

Or to use the Google LLM:
```python
from langchain_google_genai import ChatGoogleGenerativeAI
from cogni_summary import cogni_summary

# ChatGoogleGenerativeAI requires a model name; substitute any Gemini model you have access to
llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")
user_input = "Your text here"
response = cogni_summary(user_input, llm=llm)
```

If you need higher rate limits for LLM7, you can pass your own API key either via the `LLM7_API_KEY` environment variable or directly as the `api_key` parameter.
Get a free API key by registering at https://token.llm7.io/.
For more information about the LangChain libraries used in this package, please refer to:
- LangChain LLM7 documentation
- LangChain OpenAI documentation
- LangChain Anthropic documentation
- LangChain Google GenAI documentation
Please submit any issues or pull requests to our GitHub repository: https://github.com/chigwell/cogni-summary
Eugene Evstafev (hi@eugene.plus)