Update README.md (#47)
* Update README.md

* docs: Update README with comprehensive LLM provider configuration guide

---------

Co-authored-by: Nikita Ruban <[email protected]>
guo and nicky-ru authored Jan 28, 2025
1 parent cedde92 commit 0c1663d
Showing 1 changed file (README.md) with 37 additions and 1 deletion.
@@ -140,10 +140,46 @@ Quicksilver is just getting started, and there’s immense potential for growth.

Have an idea outside of this list? We’d love to hear it!

## Quicksilver works with Eliza

Quicksilver serves sentient AI queries as the DePIN plugin on [Eliza](https://github.com/elizaOS/eliza). Simply enable the plugin and start using it. With Quicksilver, your Eliza agent gains sentient-like capabilities to interact intelligently with the world. The current capabilities are listed above; if you would like to add more, please refer to the [Contributing](#contributing) section.

## LLM Provider Configuration

Quicksilver supports multiple LLM providers and uses a dual-LLM architecture with a fast LLM for initial processing and a primary LLM for complex reasoning. Configure your providers in the `.env` file:

```env
# LLM Provider API Keys
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
# LLM Model Selection
FAST_LLM_MODEL=gpt-4o-mini # Model for fast processing
LLM_MODEL=deepseek-chat # Model for primary reasoning
```
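The two model variables map onto the dual-LLM split described above. A minimal sketch of how they might be resolved at startup, falling back to the defaults named in this README when a variable is unset (the `resolveModel` helper is hypothetical, not part of Quicksilver's API):

```typescript
// Hypothetical helper: prefer the env var, otherwise fall back to a default.
function resolveModel(envValue: string | undefined, fallback: string): string {
  return envValue && envValue.trim() !== "" ? envValue : fallback;
}

// Defaults here match the models shown in the .env example above.
const fastModel = resolveModel(process.env.FAST_LLM_MODEL, "gpt-4o-mini");
const primaryModel = resolveModel(process.env.LLM_MODEL, "deepseek-chat");
```

This keeps the `.env` file optional: with no configuration at all, the agent still starts with a working fast/primary model pair.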

### Supported Providers

- **OpenAI**: Use OpenAI's models by setting the provider to `openai`
- Default model: `gpt-4o-mini`

- **Anthropic**: Use Claude models by setting the provider to `anthropic`
- Default model: `claude-3-5-haiku-latest`

- **DeepSeek**: Use DeepSeek's models by setting the provider to `deepseek`
- Default model: `deepseek-chat`
- Note: DeepSeek uses OpenAI-compatible API endpoints
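The provider-to-default-model pairing above can be sketched as a simple lookup table (the `Provider` type and `defaultModel` function are illustrative assumptions; only the provider names and default models come from this README):

```typescript
// Providers supported per the list above.
type Provider = "openai" | "anthropic" | "deepseek";

// Default model for each provider, as documented in this section.
// DeepSeek is served through OpenAI-compatible endpoints, so switching to it
// typically only changes the base URL and model name, not the request shape.
const DEFAULT_MODELS: Record<Provider, string> = {
  openai: "gpt-4o-mini",
  anthropic: "claude-3-5-haiku-latest",
  deepseek: "deepseek-chat",
};

function defaultModel(provider: Provider): string {
  return DEFAULT_MODELS[provider];
}
```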

You can configure both the fast LLM and primary LLM providers in the SentientAI initialization:

```typescript
new SentientAI({
fastLLMProvider: "openai", // For quick processing
llmProvider: "deepseek" // For main reasoning
});
```
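To make the split between the two providers concrete, here is a hedged sketch of how a dual-LLM flow can work: the fast LLM does a cheap first pass, and the primary LLM does the heavier reasoning. The `LLM` interface and `answer` function are hypothetical illustrations, not Quicksilver's actual internals:

```typescript
// Minimal illustrative interface for a chat-completion backend.
interface LLM {
  complete(prompt: string): Promise<string>;
}

// Fast LLM outlines the query cheaply; primary LLM produces the final answer.
async function answer(fast: LLM, primary: LLM, query: string): Promise<string> {
  const outline = await fast.complete(`Classify and outline this query: ${query}`);
  return primary.complete(`Using this outline:\n${outline}\nAnswer: ${query}`);
}
```

The design rationale is cost and latency: routine classification runs on the cheaper fast model, while only the final reasoning step pays for the stronger primary model.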

## Contributing

We welcome contributions! To contribute:
