Fix ChatQnA streaming response issue (#88)
Signed-off-by: lvliang-intel <[email protected]>
lvliang-intel authored Apr 18, 2024
1 parent 0ac6fd4 commit 9aa89ec
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion ChatQnA/langchain/docker/qna-app/app/server.py
@@ -219,7 +219,7 @@ def stream_generator():
     for text in router.llm_chain.stream({"question": query, "chat_history": router.chat_history}):
         chat_response += text
         processed_text = post_process_text(text)
-        if text is not None:
+        if text and processed_text:
             yield processed_text
     chat_response = chat_response.split("</s>")[0]
     print(f"[rag - chat_stream] stream response: {chat_response}")
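The guard change above matters because post_process_text() can return None for whitespace-only chunks, and yielding None from a streaming generator would corrupt the SSE response. A minimal, self-contained sketch (the chunk list stands in for router.llm_chain.stream(), which is not reproduced here) of the fixed generator:

```python
def post_process_text(text: str):
    # Same logic as the repository's utils.post_process_text after this commit.
    if text == " ":
        return "data: @#$\n\n"
    if text == "\n":
        return "data: <br/>\n\n"
    if text.isspace():
        return None  # whitespace-only chunks are dropped
    return f"data: {text.replace(' ', '@#$')}\n\n"

def stream_generator(chunks):
    # `chunks` is a hypothetical stand-in for the token stream from the LLM chain.
    chat_response = ""
    for text in chunks:
        chat_response += text
        processed_text = post_process_text(text)
        # Old check `if text is not None:` let processed_text == None through;
        # the fix also requires a non-None, non-empty processed chunk.
        if text and processed_text:
            yield processed_text
```

For example, a tab-only chunk is accumulated into chat_response but never yielded, while real tokens are framed as SSE `data:` lines.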
4 changes: 2 additions & 2 deletions ChatQnA/langchain/docker/qna-app/app/utils.py
@@ -344,9 +344,9 @@ def reload_retriever(embeddings, index_name):
 def post_process_text(text: str):
     if text == " ":
         return "data: @#$\n\n"
-    if text.isspace():
-        return None
     if text == "\n":
         return "data: <br/>\n\n"
+    if text.isspace():
+        return None
     new_text = text.replace(" ", "@#$")
     return f"data: {new_text}\n\n"
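The reordering above is the heart of the fix: `"\n".isspace()` is True in Python, so with the old order the general isspace() branch swallowed newline chunks and the `<br/>` line break was never emitted. A sketch of the corrected function, checking the specific cases before the general one:

```python
def post_process_text(text: str):
    if text == " ":
        return "data: @#$\n\n"      # single space is encoded as a placeholder token
    if text == "\n":                # must precede the general isspace() check,
        return "data: <br/>\n\n"    # since "\n".isspace() is also True
    if text.isspace():
        return None                 # any other pure-whitespace chunk is dropped
    new_text = text.replace(" ", "@#$")
    return f"data: {new_text}\n\n"
```

With this order, a newline chunk maps to an HTML line break while tabs and other bare whitespace are still filtered out.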
