
feat: AI Chat UI #25

Merged 12 commits into main on Dec 18, 2024

Conversation

@ChristopherChudzicki (Collaborator) commented Dec 16, 2024

What are the relevant tickets?

Closes https://github.com/mitodl/hq/issues/6238

Description (What does it do?)

Adds a component for talking to, and displaying messages from, an AI chat API.

Screenshots (if appropriate):

[Screenshot 2024-12-16 at 1:03 PM]

How can this be tested?

  1. Make sure you have the repo set up: clone it, then run `yarn install` and `yarn start`.
  2. View the docs for the AI Chat component at http://localhost:6006/?path=/docs/smoot-design-aichat--docs ... they should make sense.
  3. Some other things to check in AiChat:
    • the scroll position should stay at the bottom of the chat window as new messages come in, unless the user has scrolled up
    • markdown is rendered, with links in our red text
    • Accessibility: If you have a screenreader (e.g., VoiceOver on macOS), you could try the screenreader functionality:
      • when a new message is received from the AI, once it is fully loaded, it is read aloud.
      • While waiting for responses, you'll hear "loading", then "still loading". These announcements are customizable on the AiChat component.
  4. A few other components were added as building blocks that also have their own stories.
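The "stay at bottom unless the user has scrolled up" behavior from step 3 can be sketched as a small pure function. This is not the actual ScrollSnap implementation, just an illustration of the logic; the `threshold` default of 10px is an assumption.

```typescript
// Hypothetical sketch of the auto-scroll logic (not the real ScrollSnap code):
// keep the view pinned to the bottom unless the user has scrolled up.
type ScrollState = {
  scrollTop: number // current scroll offset
  scrollHeight: number // total content height
  clientHeight: number // visible height
}

// A position counts as "at the bottom" when the remaining distance to the
// bottom is within `threshold` pixels.
function isAtBottom(s: ScrollState, threshold = 10): boolean {
  return s.scrollHeight - s.scrollTop - s.clientHeight <= threshold
}

// On new content, scroll down only if the user was already (near) the bottom.
function shouldAutoScroll(before: ScrollState, threshold = 10): boolean {
  return isAtBottom(before, threshold)
}
```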

Additional Context

The AiChat screenreader functionality is modeled after how https://chatgpt.com/ behaves. It's not ideal: both the implementation here and https://chatgpt.com/ read the raw message content, which is markdown. So you'll hear things like "star star", and URLs get read aloud. This should be improved, but it's a decent start.
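One possible improvement, sketched here with a hypothetical helper (not part of this PR), is to strip common markdown syntax before handing a message to the live region, so the announcer doesn't read "star star" or raw URLs. A real fix would more likely render the markdown to plain text with an existing parser rather than use regexes.

```typescript
// Hypothetical sketch: strip common markdown syntax before announcing a
// message to a screen reader. Regex-based and intentionally simplistic.
function toAnnouncementText(markdown: string): string {
  return markdown
    .replace(/\[([^\]]+)\]\([^)]+\)/g, "$1") // keep link text, drop the URL
    .replace(/(\*\*|__)(.*?)\1/g, "$2") // bold
    .replace(/(\*|_)(.*?)\1/g, "$2") // emphasis
    .replace(/`([^`]+)`/g, "$1") // inline code
    .replace(/^#{1,6}\s+/gm, "") // headings
}
```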


<Controls />

See <a href={gitLink("src/components/AiChat/types.ts")}>AiChat/types.ts</a> for all Typescript interface definitions.
@ChristopherChudzicki (Collaborator, Author) commented on this excerpt:

There isn't a great way with Storybook docs to embed TypeScript interfaces (e.g., for complex props). A direct link to the types on GitHub seems like the best compromise for now.

related:

@ChristopherChudzicki ChristopherChudzicki changed the title AI Chatbot UI feat: AI Chat UI Dec 16, 2024
Comment on lines 21 to 24
getAbsolutePath("@storybook/addon-links"),
getAbsolutePath("@storybook/addon-essentials"),
getAbsolutePath("@storybook/addon-interactions"),
getAbsolutePath("@storybook/addon-webpack5-compiler-swc"),
@ChristopherChudzicki (Collaborator, Author):

works around a bug in Storybook that occurs when you try to run it on Node 22.12.0, detailed here: nodejs/node#56127

@jonkafton left a comment:

Works really well. Good clean chat component. Not much to complain about here! Nice separation of the ScrollSnap and SrAnnouncer (can we rename this ScreenReaderAnnouncer?). Vercel's ai package seems a sensible choice (how did they secure that package name?!). Interested to see what server-side support for native fetch ReadableStream will look like if we're intending to interface with the LLMs via our APIs; is the plan to write a custom provider?

Two feature requests (reasonably out of scope for this PR):

  • A stop button will be handy (LLMs are chatty!). Often the send-message button is replaced by a stop button while the reply is mid-stream.

  • It would be good to also support starter messages (perhaps renamed suggestions) after each response, i.e., the LLM is also prompted for new suggestions based on the most recent messages.
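The stop-button request above can be sketched in a few lines. Vercel's ai package exposes `stop` and `isLoading` from its `useChat` hook, which would drive this directly; the helpers below are hypothetical names illustrating the label toggle and the underlying cancellation mechanism (an AbortController tied to the in-flight request).

```typescript
// Hypothetical sketch of the stop-button behavior: while a reply is
// streaming, the send button becomes a stop button.
function buttonLabel(isStreaming: boolean): "Stop" | "Send" {
  return isStreaming ? "Stop" : "Send"
}

// Clicking stop would typically abort the in-flight request, e.g. one
// started with fetch(url, { signal: controller.signal }).
const controller = new AbortController()
function onStopClick(): void {
  controller.abort()
}
```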

/**
* Tolerance within which scroll will be considered "at the bottom" of the element.
*/
threshold?: number


This wasn't working for me as expected, i.e., set a high number and it does not scroll when content is added while the scroll is near the bottom, within the threshold.

It makes sense according to the description and the `!atBottom` condition: it will not scroll within the threshold, as it is already considered at the bottom. However, I interpret threshold as how far away from the bottom the scroll can be while still being treated as near enough to auto-scroll.

@ChristopherChudzicki (Collaborator, Author):

This should be fixed now. I believe threshold was working fine initially, but behaved incorrectly if you updated its value.

@ChristopherChudzicki (Collaborator, Author):

Interested to see what server side support for native fetch ReadableStream will look like if we're intending to interface the LLMs via our APIs; is the plan to write a custom provider?

As long as our chat API streams regular text, I don't think we'll need one, but worth reading up on.
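The "streams regular text" case can be sketched as follows. If the chat API responds with raw text chunks, the client just accumulates them from a ReadableStream (global in browsers and Node 18+), so no custom provider should be needed; Vercel's ai package has a text stream protocol option for this, though the exact option name is an assumption here. The helper names below are illustrative, not from this PR.

```typescript
// Hypothetical sketch: a plain-text chunk stream, standing in for a chat
// API response body.
function textStream(chunks: string[]): ReadableStream<string> {
  return new ReadableStream<string>({
    start(controller) {
      for (const chunk of chunks) controller.enqueue(chunk)
      controller.close()
    },
  })
}

// Accumulate all chunks, as a chat client would while rendering a reply.
async function collect(stream: ReadableStream<string>): Promise<string> {
  const reader = stream.getReader()
  let text = ""
  for (;;) {
    const { done, value } = await reader.read()
    if (done) return text
    text += value
  }
}
```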

@ChristopherChudzicki ChristopherChudzicki merged commit 8375c8a into main Dec 18, 2024
7 checks passed
@ChristopherChudzicki (Collaborator, Author):

🎉 This PR is included in version 1.2.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

Labels: Needs Review, released