Test on point clouds with < 100k points #3

Open
weijiawang96 opened this issue Dec 19, 2021 · 1 comment



weijiawang96 commented Dec 19, 2021

Dear author,

I would like to test on point clouds with fewer than 100k points. Is there a way to pre-process point clouds with < 100k points into the h5py/HDF5 format the code expects? How should I change the script?

Thanks in advance!

@ziruiw-dev
Collaborator

Hi @weijiawang96,

Our method does not require more than 100k points as input; the 100k figure comes from the dataloader and preprocessing setup for the PCPNet dataset specifically. You can check out this script to see how we preprocess the PCPNet data. That preprocessing code is a bit tedious because it was written around the PCPNet dataset, so you don't have to follow it.
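
For illustration only, here is a minimal sketch of writing a point cloud of any size into an HDF5 file with h5py. This is not our actual preprocessing script; the file names, the "points" dataset key, and the unit-sphere normalization are placeholder assumptions, so match them to whatever the code actually expects.

```python
# Minimal sketch, NOT the repository's preprocessing script.
# Assumes an (N, 3) xyz point cloud in a text file; the output file name
# and the "points" dataset key are placeholders, not the actual keys.
import h5py
import numpy as np

points = np.loadtxt("my_cloud.xyz").astype(np.float32)  # (N, 3), N can be < 100k

# Optional: center and scale to the unit sphere (a common normalization step).
points -= points.mean(axis=0, keepdims=True)
points /= np.linalg.norm(points, axis=1).max()

with h5py.File("my_cloud.h5", "w") as f:
    f.create_dataset("points", data=points, compression="gzip")
```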

To load your own data, you can write your own preprocessing code and dataloader, as long as the new dataloader provides input data with the same dict keys and the same dimensions as our dataloader (see the sketch below). That way you can preprocess your data however you want.
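
As a rough sketch only (assuming PyTorch; the "points" dict key below is a placeholder, so copy the real keys and tensor shapes from our dataset class), a custom dataset just needs to return the same dict structure:

```python
# Minimal sketch of a custom dataset, assuming PyTorch.
# The "points" dict key is a placeholder; match the keys and tensor
# shapes to whatever the repository's own dataloader returns.
import h5py
import torch
from torch.utils.data import Dataset, DataLoader

class MyPointCloudDataset(Dataset):
    def __init__(self, h5_paths):
        self.h5_paths = list(h5_paths)

    def __len__(self):
        return len(self.h5_paths)

    def __getitem__(self, idx):
        with h5py.File(self.h5_paths[idx], "r") as f:
            points = torch.from_numpy(f["points"][:]).float()  # (N, 3)
        return {"points": points}  # add any other keys the model expects

loader = DataLoader(MyPointCloudDataset(["my_cloud.h5"]), batch_size=1)
```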

One thing to note: the provided pretrained checkpoints were trained on data with 100k points per object, so when you switch to point clouds with a different density you will probably need to re-train the network to get the best performance on your data.

Best,
Zirui
