
IndexError #3

Open

always-anything opened this issue Jan 8, 2025 · 3 comments

Comments

@always-anything
Thank you for your great work! I'm trying your pipeline to reconstruct the DTU dataset, but it failed. Could you please help me solve this? I have tried a few different datasets, and they all run into the same problem.
Running..
/home/star/software/anaconda3/envs/fds/lib/python3.9/site-packages/torch/__init__.py:690: UserWarning: torch.set_default_tensor_type() is deprecated as of PyTorch 2.1, please use torch.set_default_dtype() and torch.set_default_device() as alternatives. (Triggered internally at ../torch/csrc/tensor/python_tensor.cpp:451.)
_C._set_default_tensor_type(t)
Load data: Begin
49
Load data: End
total points: 996943
kept points: 69092
Estimating areas for 69092 points
building octree
initializing octree
/home/star/software/anaconda3/envs/fds/lib/python3.9/site-packages/torch/nn/utils/weight_norm.py:28: UserWarning: torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.
warnings.warn("torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.")
0%| | 0/20000 [00:00<?, ?it/s]Validate: iter: 1, camera: 44
/home/star/software/anaconda3/envs/fds/lib/python3.9/site-packages/torch/functional.py:507: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at ../aten/src/ATen/native/TensorShape.cpp:3549.)
return _VF.meshgrid(tensors, **kwargs) # type: ignore[attr-defined]
threshold: 0.0
[export.py:78 - export_mesh() ] Exporting 938764 faces as PLY
[exp_runner.py:509 - validate_mesh() ] End
0%|▎ | 46/20000 [00:16<2:00:23, 2.76it/s]
Traceback (most recent call last):
File "/home/star/xzy/fast_dipole_sums/exp_runner.py", line 584, in <module>
runner.train()
File "/home/star/xzy/fast_dipole_sums/exp_runner.py", line 173, in train
data = self.dataset.gen_random_rays_at(image_idx, self.batch_size)
File "/home/star/xzy/fast_dipole_sums/models/dataset.py", line 121, in gen_random_rays_at
p = torch.matmul(self.intrinsics_all_inv[img_idx, None, :3, :3], p[:, :, None]).squeeze() # batch_size, 3
IndexError: index 48 is out of bounds for dimension 0 with size 48
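(Editor's note: the failure is a plain out-of-bounds index on the first tensor dimension: only 48 camera matrices were loaded, but the training loop sampled image index 48. A minimal sketch reproducing the same error; the shapes here are illustrative, not taken from the actual dataset.)

```python
import torch

# 48 camera matrices loaded, valid indices are 0..47
intrinsics_all_inv = torch.eye(4).repeat(48, 1, 1)

img_idx = 48  # sampled as if 49 images existed
try:
    intrinsics_all_inv[img_idx, None, :3, :3]
except IndexError as e:
    print(e)  # index 48 is out of bounds for dimension 0 with size 48
```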

@hanyucc
Collaborator

hanyucc commented Jan 12, 2025

Hmm, this is strange. Which scene have you tried running on? I just ran the code on a couple of DTU scenes and it seems to be working fine. The length of self.intrinsics_all_inv should be equal to 64, which is the number of images in each DTU scene, but it seems like it's 48 in your error message.

@always-anything
Author

Hello and thank you for your attention! I just re-downloaded the DTU dataset from the link you provided and found that part of the DTU dataset only has 49 images while the rest have 64. However, when I tried my custom data, the same error happened. Are there any parameters that need to be changed for different datasets? Or can I simply change self.intrinsics_all_inv to ensure this parameter matches the number of images? Thank you again for your reply!

@hanyucc
Collaborator

hanyucc commented Jan 13, 2025

I see. Indeed half of the scenes have 49 images and half of them have 64 images. However, the number of items cameras_sphere.npz contains should be consistent with the number of images. I haven't been able to reproduce this issue: I tried running on scan 24 (which has 49 images) and scan 110 (which has 64 images) and they seem to be working fine. Which scenes are you having issues with?
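(Editor's note: one way to catch such a mismatch up front is to compare the image count on disk against the number of camera matrices stored in cameras_sphere.npz before training starts. A minimal sketch, assuming a NeuS-style naming convention of one world_mat_<i> entry per image and an image/ subdirectory of PNG files — both are assumptions about the repository's data layout, not confirmed here.)

```python
import glob
import os

import numpy as np


def check_camera_image_consistency(data_dir):
    """Raise if the number of images on disk does not match the
    number of camera matrices in cameras_sphere.npz."""
    n_images = len(glob.glob(os.path.join(data_dir, "image", "*.png")))
    cams = np.load(os.path.join(data_dir, "cameras_sphere.npz"))
    # Count 'world_mat_<i>' entries, excluding any 'world_mat_inv_<i>' keys
    n_cams = sum(
        1 for k in cams.files
        if k.startswith("world_mat_") and not k.startswith("world_mat_inv")
    )
    if n_images != n_cams:
        raise ValueError(
            f"{n_images} images but {n_cams} camera matrices in cameras_sphere.npz"
        )
    return n_images
```

Running this once per scene before training would surface the 48-vs-49 discrepancy as a clear error instead of an IndexError mid-run.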
