Colab Hard Crashes (transformers maybe) #2982
Comments
Now when trying to run colab, it gets to the "Start Stable Diffusion" cell and just stops. [traceback omitted]
Restarting the notebook should not cause it to crash. [traceback omitted]
Part II
Part III
Thanks anyway Ben, appreciate the response =) [traceback omitted]
Tried again today, same results.
Hello Ben & Community
Google Colab (Pro) Latest Version
So start up is fine but if i reload i get this and the notebook stops
(Now, for years, if i want to change model it crashes with a C, so i just reload)
But now i have to get a new T4 and start from scratch, if i want to change models =(
Here is the error code
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1146, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 83, in <module>
    from accelerate import __version__ as accelerate_version
  File "/usr/local/lib/python3.11/dist-packages/accelerate/__init__.py", line 16, in <module>
    from .accelerator import Accelerator
  File "/usr/local/lib/python3.11/dist-packages/accelerate/accelerator.py", line 34, in <module>
    from huggingface_hub import split_torch_state_dict_into_shards
ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/usr/local/lib/python3.11/dist-packages/huggingface_hub/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/webui.py", line 13, in <module>
    initialize.imports()
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/initialize.py", line 17, in imports
    import pytorch_lightning  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/__init__.py", line 34, in <module>
    from pytorch_lightning.callbacks import Callback  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/callbacks/__init__.py", line 14, in <module>
    from pytorch_lightning.callbacks.callback import Callback
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/callbacks/callback.py", line 25, in <module>
    from pytorch_lightning.utilities.types import STEP_OUTPUT
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/utilities/types.py", line 28, in <module>
    from torchmetrics import Metric
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/__init__.py", line 14, in <module>
    from torchmetrics import functional  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/__init__.py", line 77, in <module>
    from torchmetrics.functional.text.bleu import bleu_score
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/__init__.py", line 30, in <module>
    from torchmetrics.functional.text.bert import bert_score  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/bert.py", line 24, in <module>
    from torchmetrics.functional.text.helper_embedding_metric import (
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/helper_embedding_metric.py", line 26, in <module>
    from transformers import AutoModelForMaskedLM, AutoTokenizer, PreTrainedModel, PreTrainedTokenizerBase
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1136, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1148, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/usr/local/lib/python3.11/dist-packages/huggingface_hub/__init__.py)
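The traceback points at a version mismatch: accelerate asks huggingface_hub for split_torch_state_dict_into_shards, and the huggingface_hub installed in the runtime does not have it. A minimal sketch of how one might check and work around this from a Colab cell, assuming a stale huggingface_hub really is the cause (upgrading it in place is a workaround, not the notebook's official fix):

# Run in a new Colab cell before the "Start Stable Diffusion" cell.
# Assumption: the ImportError above comes from a huggingface_hub release
# that predates split_torch_state_dict_into_shards.
import importlib.metadata as md
import subprocess
import sys

# Show the versions currently installed in the runtime.
for pkg in ("huggingface_hub", "accelerate", "transformers"):
    print(pkg, md.version(pkg))

try:
    from huggingface_hub import split_torch_state_dict_into_shards  # noqa: F401
    print("symbol found; the crash likely has another cause")
except ImportError:
    # Upgrade huggingface_hub, then restart the runtime so the stale
    # copy that is already imported gets dropped.
    subprocess.check_call([sys.executable, "-m", "pip", "install", "-U", "huggingface_hub"])
    print("huggingface_hub upgraded; restart the runtime and rerun the notebook")

If a blanket upgrade disturbs other pins in the notebook, installing a specific release that has the function (it appears to have been added around huggingface_hub 0.23; treat that bound as an assumption) would be the gentler variant.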
AND
Once upon a time I could load the model I want into Colab and start with it. (Now I have to load it with the Stable XL model each time.)
What happened to that?
Anyone know?
Anyhow, thanks all.