
Loading SeaLLM-hybrid-7b #5

Open
rinabuoy opened this issue Jan 5, 2024 · 2 comments
rinabuoy commented Jan 5, 2024

Hello there,

Thanks for sharing your remarkable work on SeaLLMs - Large Language Models for Southeast Asia.

My name is Rina and I'm from Cambodia. I have been playing around with your 7b chat model and managed to load it using the Llama 2 code base.

However, when I tried to load SeaLLM-hybrid-7b, the following error was raised:

(screenshot of the error traceback attached)

nxphi47 (Contributor) commented Jan 5, 2024

Hi, thanks for your interest. The error occurs because the hybrid safetensors files are missing the metadata['format'] = 'pt' attribute. I will update them soon. In the meantime, you can download the safetensors files, patch them with the snippet below, and overwrite the originals:

import safetensors
from safetensors.torch import save_file

# safetensors_path points to the downloaded .safetensors file to patch
tensors = dict()

# Read every tensor out of the existing file
with safetensors.safe_open(safetensors_path, framework="pt") as f:
    for key in f.keys():
        tensors[key] = f.get_tensor(key)

# Re-save the file in place with the metadata field the loader expects
save_file(tensors, safetensors_path, metadata={'format': 'pt'})
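
For reference, a minimal sketch of looping the snippet above over every shard of a locally downloaded checkpoint; the directory name and the *.safetensors glob here are illustrative assumptions, not details from this thread:

import glob
import os

import safetensors
from safetensors.torch import save_file

checkpoint_dir = "./SeaLLM-hybrid-7b"  # hypothetical local download location

for safetensors_path in glob.glob(os.path.join(checkpoint_dir, "*.safetensors")):
    tensors = dict()
    # Copy every tensor out of the shard
    with safetensors.safe_open(safetensors_path, framework="pt") as f:
        for key in f.keys():
            tensors[key] = f.get_tensor(key)
    # Overwrite the shard in place, adding the required metadata
    save_file(tensors, safetensors_path, metadata={"format": "pt"})
    print(f"patched {safetensors_path}")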

rinabuoy (Author) commented Jan 6, 2024

Thanks very much! The proposed fix worked.
