Loading a Python 3.7 version model with Python 3.9 #9

Open
hanyangii (Collaborator) opened this issue Sep 30, 2024 · 0 comments

(methylbert_39) python3 test_deconvolute.py 
Building Vocab
Total number of sequences :  612
# of reads in each label:  [319. 293.]
No detected GPU device. Load the model on CPU
The model is loaded on CPU
Restore the pretrained model tmp/bert.model/
You passed along `num_labels=20` with an incompatible id to label map: {'0': 'LABEL_0', '1': 'LABEL_1'}. The number of labels wil be overwritten to -1.
Cross entropy loss assigned
Traceback (most recent call last):
  File "/omics/groups/OE0219/internal/Yunhee/DL_project/methylbert/test/test_deconvolute.py", line 48, in <module>
    trainer.load(model_dir)
  File "/omics/groups/OE0219/internal/Yunhee/anaconda3/envs/methylbert_39/lib/python3.9/site-packages/methylbert/trainer.py", line 646, in load
    self.bert = MethylBertEmbeddedDMR.from_pretrained(dir_path, 
  File "/omics/groups/OE0219/internal/Yunhee/anaconda3/envs/methylbert_39/lib/python3.9/site-packages/transformers/modeling_utils.py", line 3960, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/omics/groups/OE0219/internal/Yunhee/anaconda3/envs/methylbert_39/lib/python3.9/site-packages/transformers/modeling_utils.py", line 4492, in _load_pretrained_model
    raise RuntimeError(f"Error(s) in loading state_dict for {model.__class__.__name__}:\n\t{error_msg}")
RuntimeError: Error(s) in loading state_dict for MethylBertEmbeddedDMR:
        size mismatch for dmr_encoder.0.weight: copying a param with shape torch.Size([20, 151]) from checkpoint, the shape in current model is torch.Size([10, 151]).
        You may consider adding `ignore_mismatched_sizes=True` in the model `from_pretrained` method.
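
A quick way to confirm the mismatch reported above is to inspect the tensor shapes stored in the saved model directly. This is a minimal diagnostic sketch, not part of methylbert; the checkpoint file name "pytorch_model.bin" is an assumption (use whichever weight file sits inside the model directory).

# Diagnostic sketch: print the shapes of the dmr_encoder tensors stored in the
# Python 3.7 checkpoint. "pytorch_model.bin" is an assumed file name.
import os
import torch

model_dir = "tmp/bert.model/"  # directory passed to trainer.load() in test_deconvolute.py
state_dict = torch.load(os.path.join(model_dir, "pytorch_model.bin"), map_location="cpu")

for key, tensor in state_dict.items():
    if "dmr_encoder" in key:
        # Expect dmr_encoder.0.weight to be (20, 151) in the old checkpoint,
        # while the freshly built model allocates (10, 151).
        print(key, tuple(tensor.shape))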

Run "test_deconvolute.py" with the previous model directory

hanyangii self-assigned this Oct 26, 2024