
google-generativeai package does not work in Mesop + Colab #1183

Closed
williamito opened this issue Jan 16, 2025 · 1 comment

Labels
bug Something isn't working

Comments

williamito (Contributor) commented Jan 16, 2025

Describe the bug
When using the google-generativeai SDK within a mel.chat() transform function, the UI gets stuck and the response never resolves, and a number of errors appear in the browser console. The same code works as expected in a normal Colab code cell. I'm told it also works when running Mesop directly (not in Colab), but I did not try to reproduce that myself.

Interestingly, the equivalent code from the google-genai SDK does work.

The failure is similar with both the generate_content() and start_chat() versions (a sketch of the generate_content() variant follows the repro below).

To Reproduce

This fails:

!pip install mesop
!pip install google-generativeai

import google.generativeai as genai
genai.configure(api_key="API_KEY")
model = genai.GenerativeModel('gemini-2.0-flash-exp')
chat_session = model.start_chat()

import mesop as me
import mesop.labs as mel
me.colab_run()

@me.page(path="/chat")
def page():
  mel.chat(transform)

def transform(prompt: str, _: list[mel.ChatMessage]) -> str:
    response = chat_session.send_message(prompt)
    return response.text

me.colab_show(path="/chat", height='800')
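
The generate_content() variant mentioned above fails the same way. A minimal sketch for illustration (not the reporter's exact code; it assumes the same API key and model as the repro above, but makes a single-turn call instead of using a chat session):

import google.generativeai as genai
import mesop.labs as mel

genai.configure(api_key="API_KEY")
model = genai.GenerativeModel('gemini-2.0-flash-exp')

def transform(prompt: str, _: list[mel.ChatMessage]) -> str:
    # Single-turn request instead of a chat session; also hangs inside mel.chat() on Colab.
    response = model.generate_content(prompt)
    return response.text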

This works:

!pip install mesop
!pip install google-genai

from google import genai
client = genai.Client(api_key="API_KEY")
chat_session = client.chats.create(model='gemini-2.0-flash-exp')

import mesop as me
import mesop.labs as mel
me.colab_run()

@me.page(path="/chat")
def page():
  mel.chat(transform)

def transform(prompt: str, _: list[mel.ChatMessage]) -> str:
    response = chat_session.send_message(prompt)
    return response.text

me.colab_show(path="/chat", height='800')
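
Not part of the original report, but since the google-genai path works: mel.chat() also accepts a transform that yields strings, and the google-genai chat object provides send_message_stream(), so the working example can stream partial responses. A streaming sketch, assuming the same API key and model as above:

from typing import Iterator

from google import genai
import mesop.labs as mel

client = genai.Client(api_key="API_KEY")
chat_session = client.chats.create(model='gemini-2.0-flash-exp')

def transform(prompt: str, _: list[mel.ChatMessage]) -> Iterator[str]:
    # Stream chunks to the chat UI as they arrive instead of waiting for the full reply.
    for chunk in chat_session.send_message_stream(prompt):
        if chunk.text:
            yield chunk.text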

williamito added the bug label on Jan 16, 2025

richard-to (Collaborator) commented Jan 17, 2025

I think this is related to this issue: #921

It seems like something on the Colab side that we can't fix on our end, so it may be better to just use the google-genai lib since that one does work, as you mentioned.
