Nicolay Rusnachenko edited this page Dec 26, 2024 · 7 revisions


A lightweight, no-strings-attached framework for your LLM that applies a Chain-of-Thought-like prompt schema (see the related section) to massive textual collections.


API

This is a simple example of API usage:

from os.path import join

from bulk_chain.core.utils import dynamic_init
from bulk_chain.api import iter_content

llm = dynamic_init(class_dir=join("."),
                   class_filepath="ext/replicate.py",
                   class_name="Replicate")(api_token="<API-KEY>")

content_it = iter_content(
    # 1. Your iterator of dictionaries.
    data_it,
    # 2. Your third-party model implementation.
    llm=llm,
    # 3. Your schema.
    schema="ext/schema/default.json")

for content in content_it:
    # Handle your LLM responses here ...
    print(content)
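`iter_content` consumes any iterator of plain dictionaries. A minimal sketch of what `data_it` might look like (the `"text"` field name here is an assumption for illustration; the fields your schema actually reads are defined in your schema file):

```python
# A hypothetical data_it: any iterator of dictionaries works, e.g. a
# generator reading rows from a CSV or JSONL file. The "text" field
# name is illustrative; your schema defines the expected input fields.
data_it = iter([
    {"text": "The service was excellent and the food arrived quickly."},
    {"text": "The package arrived damaged and support never replied."},
])
```

Because `data_it` is an iterator, records are pulled lazily, so large collections never need to fit in memory at once.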

Batching

You can enable batching for your model by passing the support_batching parameter to the base class constructor:

class YourLLM(BaseLM):

    def __init__(self, **kwargs):
        super(YourLLM, self).__init__(support_batching=True, **kwargs)
        # Any further initialization.
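With batching enabled, the idea is that the model receives groups of prompts rather than one prompt at a time, so a single inference call can serve many records. A library-independent sketch of that pattern (`batched_ask` and the batch size are illustrative assumptions, not bulk_chain's actual API):

```python
from itertools import islice

def batched(iterable, batch_size):
    """Yield successive fixed-size chunks from any iterable."""
    it = iter(iterable)
    while chunk := list(islice(it, batch_size)):
        yield chunk

# Illustrative stand-in for a batch-capable LLM: it answers a whole
# list of prompts in one call instead of being invoked per prompt.
def batched_ask(prompts):
    return [f"answer to: {p}" for p in prompts]

prompts = [f"prompt {i}" for i in range(5)]
answers = [a for batch in batched(prompts, batch_size=2)
           for a in batched_ask(batch)]
```

Grouping calls this way is what makes batching worthwhile: for GPU-backed or remote models, one call over N prompts is typically much cheaper than N separate calls.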