Home
Nicolay Rusnachenko edited this page Dec 26, 2024
A lightweight, no-strings-attached framework for your LLM that lets you apply a Chain-of-Thought-like prompt schema (see the related section) to massive textual collections.
This is a simple example of API usage:

```python
from os.path import join

from bulk_chain.core.utils import dynamic_init
from bulk_chain.api import iter_content

llm = dynamic_init(class_dir=join("."),
                   class_filepath="ext/replicate.py",
                   class_name="Replicate")(api_token="<API-KEY>")

content_it = iter_content(
    # 1. Your iterator of dictionaries.
    data_it,
    # 2. Your third-party model implementation.
    llm=llm,
    # 3. Your schema.
    schema="ext/schema/default.json")

for content in content_it:
    # Handle your LLM responses here ...
    pass
```
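For illustration, `data_it` can be any iterable of dictionaries. Below is a minimal sketch; the `"text"` field name is hypothetical and must match whatever fields your schema actually references:

```python
# Hypothetical input records: iter_content consumes an iterator of
# dictionaries whose keys correspond to the fields used by your schema.
data_it = iter([
    {"text": "The weather in Paris is lovely today."},
    {"text": "The train was delayed by two hours."},
])
```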
You can enable batching support via the `support_batching` parameter:

```python
class YourLLM(BaseLM):

    def __init__(self, **kwargs):
        super(YourLLM, self).__init__(support_batching=True, **kwargs)
        # Any further initialization.
```
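To make the effect of batching concrete, here is a plain-Python sketch (not the bulk_chain API; the function names are illustrative): without batching, a model implementation answers one prompt at a time, while with batching it receives a list of prompts and must return a list of responses in the same order.

```python
def ask_single(prompt):
    # Without batching: one prompt in, one response out.
    return "response to: " + prompt

def ask_batch(prompts):
    # With batching enabled: a list of prompts in, a list of
    # responses out, preserving the input order.
    return ["response to: " + p for p in prompts]
```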