Error reported when using a local LLM #52

Open
anytimehh opened this issue May 19, 2024 · 2 comments

Comments

@anytimehh

cat .env
LLM_NAME="Ollama"
OLLAMA_MODEL_NAME="qwen:7b"
OLLAMA_BASE_URL="http://192.168.2.205:11434"
MIN_RELEVANCE_SCORE=0.3
BOT_TOPIC="OpenIM"
URL_PREFIX="http://192.168.2.205:11434"
USE_PREPROCESS_QUERY=0
USE_RERANKING=1
USE_DEBUG=1
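
For context, here is a minimal sketch of how an OLLAMA_BASE_URL like the one above is typically handed to the OpenAI-compatible client that the traceback below goes through. This is an illustration, not the actual rag-gpt code in server/rag/generation/llm.py: Ollama exposes its OpenAI-compatible API under the /v1 path on port 11434, and a base URL without that suffix is a common source of "404 page not found".

```python
# Illustrative sketch (assumption): wiring OLLAMA_BASE_URL / OLLAMA_MODEL_NAME
# from .env into the OpenAI-compatible client. Ollama serves its
# OpenAI-compatible API under the /v1 path.
import os
from openai import OpenAI

base_url = os.getenv("OLLAMA_BASE_URL", "http://192.168.2.205:11434")
client = OpenAI(
    base_url=f"{base_url.rstrip('/')}/v1",  # a missing /v1 often yields a 404
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model=os.getenv("OLLAMA_MODEL_NAME", "qwen:7b"),
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```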

Error traceback:

192.168.2.205 - - [19/May/2024 01:08:06] "POST /open_kf_api/queries/smart_query_stream HTTP/1.1" 500 -
Error on request:
Traceback (most recent call last):
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/werkzeug/serving.py", line 362, in run_wsgi
execute(self.server.app)
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/werkzeug/serving.py", line 325, in execute
for data in application_iter:
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/werkzeug/wsgi.py", line 256, in next
return self._next()
^^^^^^^^^^^^
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/werkzeug/wrappers/response.py", line 32, in _iter_encoded
for item in iterable:
File "/root/rag-gpt/server/app/queries.py", line 437, in generate_llm
response = generate_answer(query, user_id, True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/rag-gpt/server/app/queries.py", line 305, in generate_answer
response = llm_generator.generate(prompt, is_streaming)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/rag-gpt/server/rag/generation/llm.py", line 47, in generate
response = self.client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 663, in create
return self._post(
^^^^^^^^^^^
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1200, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 889, in request
return self._request(
^^^^^^^^^^^^^^
File "/root/rag-gpt/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 980, in _request
raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: 404 page not found
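
A NotFoundError like this means the Ollama server answered, but the path the OpenAI client requested does not exist there, so the base URL (or a URL hard-coded elsewhere in the code) is the first thing to check. Below is a quick, illustrative sanity check against the two relevant endpoints; exact paths may vary with your Ollama version.

```python
# Illustrative check: confirm the Ollama host answers on its native API
# (/api/tags lists pulled models) and on the OpenAI-compatible listing
# (/v1/models), which is the layer the traceback above goes through.
import requests

base = "http://192.168.2.205:11434"
for path in ("/api/tags", "/v1/models"):
    r = requests.get(base + path, timeout=5)
    print(path, r.status_code)
```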

@anytimehh (Author)

I found the cause: I had modified the address in the code earlier, and it just needs to be changed back.

@blmdxiao (Contributor)

I will refine the deployment documentation later.
