Increased Anonymization #5

Open
comath opened this issue Sep 5, 2023 · 0 comments
Idea

We can create a /chat/random endpoint that generates from a random model.
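A minimal sketch of such an endpoint handler, assuming a hypothetical `MODELS` registry and `generate` helper (both placeholders, not part of the existing codebase):

```python
import random

# Placeholder model registry; real names would come from the router config.
MODELS = ["model-a", "model-b", "model-c"]

def generate(model: str, prompt: str) -> str:
    # Stand-in for the router's actual generation call.
    return f"reply to: {prompt}"

def chat_random(prompt: str) -> dict:
    """Handler sketch for /chat/random: pick a model uniformly at
    random and respond without revealing which model was used."""
    model = random.choice(MODELS)
    return {"response": generate(model, prompt)}  # no model name leaked
```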

Complete Anon

Ongoing conversations would need a token stored in the Redis KV store. Once you start a conversation with a particular model, you should get back a conversation UUID unique to that conversation. On subsequent requests, the LLM Router would look up the model associated with that conversation and use it to generate the next parts of the conversation.

This would put all model anonymization at the router component and would reduce the risk of a database leak in CTFd.

It would also require more persistent storage of the data in the LLM Router pod so that the event dataset can be reconstructed with the CTFd data.
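The UUID-to-model mapping described above can be sketched as follows, using a plain dict as an in-memory stand-in for the Redis KV store (a real deployment would use redis-py's `set`/`get`; `generate` is a hypothetical helper):

```python
import uuid

# In-memory stand-in for the Redis KV store.
conversation_store = {}

def generate(model: str, prompt: str) -> str:
    # Placeholder for the actual LLM Router generation call.
    return f"reply to: {prompt}"

def start_conversation(model: str) -> str:
    """Bind a new conversation UUID to the chosen model.
    The client only ever sees the UUID, never the model name."""
    conv_id = str(uuid.uuid4())
    conversation_store[conv_id] = model
    return conv_id

def continue_conversation(conv_id: str, prompt: str) -> str:
    """Look up the model for an ongoing conversation and generate
    the next turn with it."""
    model = conversation_store[conv_id]
    return generate(model, prompt)
```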

Incomplete Anon

The returned value would include the name of the model, so that subsequent turns can be generated with the /chat/generate endpoint.
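Under that scheme the response simply carries the model name alongside the text, and the client echoes it back on later requests. A toy sketch (model names are placeholders):

```python
import random

def chat_start_incomplete(prompt: str) -> dict:
    """Incomplete-anonymization sketch: the model name is returned so
    the client can pass it back to /chat/generate on later turns."""
    model = random.choice(["model-a", "model-b"])  # placeholder names
    return {"model": model, "response": f"reply to: {prompt}"}
```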
