Colab Hard Crashes (transformers maybe) #2982

Open
Azura-13 opened this issue Jan 21, 2025 · 6 comments

@Azura-13

Hello Ben & community,

Google Colab (Pro), latest version.
Startup is fine, but if I reload the notebook I get the error below and it stops.
(For years now, changing the model has crashed it with a C, so I would just reload.)
But now I have to get a new T4 and start from scratch whenever I want to change models =(

Here is the error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1146, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/usr/local/lib/python3.11/dist-packages/transformers/modeling_utils.py", line 83, in <module>
    from accelerate import __version__ as accelerate_version
  File "/usr/local/lib/python3.11/dist-packages/accelerate/__init__.py", line 16, in <module>
    from .accelerator import Accelerator
  File "/usr/local/lib/python3.11/dist-packages/accelerate/accelerator.py", line 34, in <module>
    from huggingface_hub import split_torch_state_dict_into_shards
ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/usr/local/lib/python3.11/dist-packages/huggingface_hub/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/webui.py", line 13, in <module>
    initialize.imports()
  File "/content/gdrive/MyDrive/sd/stable-diffusion-webui/modules/initialize.py", line 17, in imports
    import pytorch_lightning  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/__init__.py", line 34, in <module>
    from pytorch_lightning.callbacks import Callback  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/callbacks/__init__.py", line 14, in <module>
    from pytorch_lightning.callbacks.callback import Callback
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/callbacks/callback.py", line 25, in <module>
    from pytorch_lightning.utilities.types import STEP_OUTPUT
  File "/usr/local/lib/python3.11/dist-packages/pytorch_lightning/utilities/types.py", line 28, in <module>
    from torchmetrics import Metric
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/__init__.py", line 14, in <module>
    from torchmetrics import functional  # noqa: E402
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/__init__.py", line 77, in <module>
    from torchmetrics.functional.text.bleu import bleu_score
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/__init__.py", line 30, in <module>
    from torchmetrics.functional.text.bert import bert_score  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/bert.py", line 24, in <module>
    from torchmetrics.functional.text.helper_embedding_metric import (
  File "/usr/local/lib/python3.11/dist-packages/torchmetrics/functional/text/helper_embedding_metric.py", line 26, in <module>
    from transformers import AutoModelForMaskedLM, AutoTokenizer, PreTrainedModel, PreTrainedTokenizerBase
  File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1136, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/import_utils.py", line 1148, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/usr/local/lib/python3.11/dist-packages/huggingface_hub/__init__.py)
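
For reference, the inner traceback bottoms out in accelerate trying to import split_torch_state_dict_into_shards from huggingface_hub, which usually means the installed huggingface_hub is older than what the installed accelerate and transformers expect. A minimal sketch for checking the three versions from a fresh Colab cell (plain importlib.metadata, nothing notebook-specific assumed):

# Sketch: print the versions of the packages in the failing import chain.
# Run in a fresh Colab cell; nothing notebook-specific is assumed here.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("huggingface_hub", "accelerate", "transformers"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")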

AND

Once upon a time I could load the model I wanted into Colab and start with it. (Now I have to load it with the Stable XL model each time.)
What happened to that?
Anyone know?

Anyhow, thanks all.

@Azura-13 (Author)

Now when I try to run the Colab, it gets to the "Start Stable Diffusion" cell and just stops.
Then when I do "Restart and run all" in the same session, it crashes with the traceback below.

(Same traceback as above: ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub', surfacing as RuntimeError: Failed to import transformers.modeling_utils.)

@Azura-13 (Author)

  1. Delete the SD folder
  2. Run the notebook from the link above (supplied by Ben)
  3. Gradio loads (I have nothing installed: no model, no LoRA, etc.)
  4. Restart and run all
  5. Cell 6 errors and stops running

(Same traceback as above, ending in ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub'.)

Restarting the notebook should not cause it to crash.
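
One hedged workaround worth trying (not the notebook's official fix, just an assumption that the pinned huggingface_hub is too old for the accelerate the notebook installs): upgrade huggingface_hub in a cell before "Start Stable Diffusion" and restart the runtime afterwards, roughly like this:

# Sketch of a workaround cell, run BEFORE "Start Stable Diffusion".
# Assumption: accelerate needs a newer huggingface_hub than the one installed;
# the ">=0.23.0" floor below is a guess -- check accelerate's requirements.
import subprocess
import sys

subprocess.check_call([
    sys.executable, "-m", "pip", "install", "--upgrade",
    "huggingface_hub>=0.23.0",
])
# Then restart the runtime (Runtime > Restart session) so the already-imported,
# older huggingface_hub module is not still cached in sys.modules.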

@Azura-13 (Author)

Part II

  1. Put everything back into SD (VAE, LoRA, models, After Detailer, AnimateDiff, etc.)
  2. Fresh session, fresh T4
  3. Run the notebook
  4. Cell 7 (Start Stable Diffusion) ticks as completed and stops running

@Azura-13 (Author)

Part III

  1. Repeat the above
  2. After cell 7 stops running with no message, do "Restart and run all" from the same session
  3. Cell 7 gives an error message and stops running (same message as always)
  4. I give up lol

Thanks anyway Ben, appreciate the response =)

(Same traceback as above, ending in ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub'.)

@Azura-13 (Author)

Tried again today, same results.
Does anyone know what the error I keep getting means?
(It's above.)
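
For anyone debugging this, a minimal sketch of a sanity check (assuming the runtime is in the same state the webui sees): try the single failing import in a fresh cell. If it fails there too, the problem is the installed package versions, not the notebook itself.

# Minimal reproduction of the error, independent of the webui.
try:
    from huggingface_hub import split_torch_state_dict_into_shards  # noqa: F401
except ImportError as exc:
    import huggingface_hub
    # An old huggingface_hub lacks this helper, which is exactly what accelerate imports.
    print(f"huggingface_hub {huggingface_hub.__version__} is too old: {exc}")
else:
    print("huggingface_hub is new enough; the error must come from something else.")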
