
confused when running on custom data #16

Open

rorrewang opened this issue Jan 11, 2025 · 1 comment

Comments

@rorrewang
Dear author, I am trying to run your code on my own point cloud data. I converted the point cloud to .vg format with easy3d following your tutorial, and also generated testlist.txt and the other required files. However, I still run into a lot of confusion when running test.py. Could you please provide inference code or a more specific tutorial?

For now, the output I get is:

(polygnn) wly@wly:/mnt/PC/polygnn$ HYDRA_FULL_ERROR=1 python test.py dataset=custom
WARNING        warnings.py:109   /home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/dash/_jupyter.py:29: DeprecationWarning: The `ipykernel.comm.Comm` class has been deprecated. Please use the `comm` module instead.For creating comms, use the function `from comm import create_comm`.
  _dash_comm = Comm(target_name="dash")

[2025-01-11 17:03:12,430][root][INFO] - Device initialized: CUDA: [0]
[2025-01-11 17:03:12,431][root][INFO] - Random seed set to 1117
Processing...
Preparing dataset:   0%|                                                                                                                      | 0/25 [00:00<?, ?it/s][2025-01-11 17:03:12,506][dataset][INFO] - processing 0001
[2025-01-11 17:03:12,506][dataset][INFO] - processing 0002
kwargs['cloud'] data/custom/raw/04_pts/0001.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0001.vg
kwargs['cloud'] data/custom/raw/04_pts/0002.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0002.vg
[2025-01-11 17:03:12,556][dataset][INFO] - processing 0005
[2025-01-11 17:03:12,556][dataset][INFO] - processing 0007
[2025-01-11 17:03:12,556][dataset][INFO] - processing 0004
[2025-01-11 17:03:12,556][dataset][INFO] - processing 0003
[2025-01-11 17:03:12,556][dataset][INFO] - processing 0008
[2025-01-11 17:03:12,556][dataset][INFO] - processing 0006
kwargs['cloud'] data/custom/raw/04_pts/0005.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0005.vg
kwargs['cloud'] data/custom/raw/04_pts/0004.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0004.vg
kwargs['cloud'] data/custom/raw/04_pts/0007.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0007.vg
kwargs['cloud'] data/custom/raw/04_pts/0008.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0008.vg
kwargs['cloud'] data/custom/raw/04_pts/0006.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0006.vg
kwargs['cloud'] data/custom/raw/04_pts/0003.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0003.vg
[2025-01-11 17:03:13,101][dataset][INFO] - processing 0009
kwargs['cloud'] data/custom/raw/04_pts/0009.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0009.vg
[2025-01-11 17:03:13,941][dataset][INFO] - processing 0010
kwargs['cloud'] data/custom/raw/04_pts/0010.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0010.vg
[2025-01-11 17:03:14,611][dataset][INFO] - processing 0011
kwargs['cloud'] data/custom/raw/04_pts/0011.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0011.vg
[2025-01-11 17:03:14,745][dataset][INFO] - processing 0013
kwargs['cloud'] data/custom/raw/04_pts/0013.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0013.vg
[2025-01-11 17:03:14,991][dataset][INFO] - processing 0014
kwargs['cloud'] data/custom/raw/04_pts/0014.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0014.vg
[2025-01-11 17:03:15,025][dataset][INFO] - processing 0015
kwargs['cloud'] data/custom/raw/04_pts/0015.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0015.vg
[2025-01-11 17:03:15,378][dataset][INFO] - processing 0017
kwargs['cloud'] data/custom/raw/04_pts/0017.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0017.vg
[2025-01-11 17:03:15,760][dataset][INFO] - processing 0018
kwargs['cloud'] data/custom/raw/04_pts/0018.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0018.vg
[2025-01-11 17:03:15,977][dataset][INFO] - processing 0019
kwargs['cloud'] data/custom/raw/04_pts/0019.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0019.vg
[2025-01-11 17:03:16,553][dataset][INFO] - processing 0024
kwargs['cloud'] data/custom/raw/04_pts/0024.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0024.vg
[2025-01-11 17:03:16,851][dataset][INFO] - processing 0025
kwargs['cloud'] data/custom/raw/04_pts/0025.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0025.vg
[2025-01-11 17:03:16,924][dataset][INFO] - processing 0028
kwargs['cloud'] data/custom/raw/04_pts/0028.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0028.vg
[2025-01-11 17:03:17,096][dataset][INFO] - processing 0029
kwargs['cloud'] data/custom/raw/04_pts/0029.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0029.vg
[2025-01-11 17:03:17,113][dataset][INFO] - processing 0030
kwargs['cloud'] data/custom/raw/04_pts/0030.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0030.vg
[2025-01-11 17:03:17,269][dataset][INFO] - processing 0031
kwargs['cloud'] data/custom/raw/04_pts/0031.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0031.vg
[2025-01-11 17:03:17,376][dataset][INFO] - processing 0036
kwargs['cloud'] data/custom/raw/04_pts/0036.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0036.vg
[2025-01-11 17:03:17,422][dataset][INFO] - processing 0040
kwargs['cloud'] data/custom/raw/04_pts/0040.npy
kwargs['vertex_group'] data/custom/raw/06_vg/0040.vg
Preparing dataset: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████| 25/25 [00:08<00:00,  3.04it/s]
Done!
WARNING        warnings.py:109   /home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/dash/_jupyter.py:29: DeprecationWarning: The `ipykernel.comm.Comm` class has been deprecated. Please use the `comm` module instead.For creating comms, use the function `from comm import create_comm`.
  _dash_comm = Comm(target_name="dash")

INFO     distributed_c10d.py:442   Added key: store_based_barrier_key:1 to store for rank: 0
INFO     distributed_c10d.py:476   Rank 0: Completed store-based barrier for key:store_based_barrier_key:1 with 1 nodes.
INFO               test.py:91    Resuming from ./checkpoints/custom/model_best.pth
eval:   0%|                                                                                                                                    | 0/1 [00:00<?, ?it/s]
WARNING        warnings.py:109   /home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/dash/_jupyter.py:29: DeprecationWarning: The `ipykernel.comm.Comm` class has been deprecated. Please use the `comm` module instead.For creating comms, use the function `from comm import create_comm`.
  _dash_comm = Comm(target_name="dash")

(the warning above is repeated once per DataLoader worker)
eval:   0%|                                                                                                                                    | 0/1 [00:08<?, ?it/s]
Process SpawnProcess-9:
Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 69, in _wrap
    fn(i, *args)
  File "/mnt/PC/polygnn/test.py", line 112, in run_eval
    for batch in pbar:
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/tqdm/std.py", line 1181, in __iter__
    for obj in iterable:
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 633, in __next__
    data = self._next_data()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1345, in _next_data
    return self._process_data(data)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1371, in _process_data
    data.reraise()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/_utils.py", line 644, in reraise
    raise exception
TypeError: Caught TypeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch_geometric/data/dataset.py", line 258, in __getitem__
    data = self.get(self.indices()[idx])
  File "/mnt/PC/polygnn/dataset.py", line 251, in get
    data.num_nodes = len(data.y)
TypeError: object of type 'NoneType' has no len()


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 76, in _wrap
    sys.exit(1)
SystemExit: 1

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/process.py", line 317, in _bootstrap
    util._exit_function()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/util.py", line 357, in _exit_function
    p.join()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/process.py", line 149, in join
    res = self._popen.wait(timeout)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/popen_fork.py", line 43, in wait
    return self.poll(os.WNOHANG if timeout == 0.0 else 0)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/popen_fork.py", line 27, in poll
    pid, sts = os.waitpid(self.pid, flag)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/signal_handling.py", line 66, in handler
    _error_if_any_worker_fails()
RuntimeError: DataLoader worker (pid 52827) is killed by signal: Terminated. 
Exception ignored in: <function Pool.__del__ at 0x7e623fce1fc0>
Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/pool.py", line 271, in __del__
    self._change_notifier.put(None)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/queues.py", line 377, in put
    self._writer.send_bytes(obj)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/connection.py", line 411, in _send_bytes
    self._send(header + buf)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
OSError: [Errno 9] Bad file descriptor
Error executing job with overrides: ['dataset=custom']
Traceback (most recent call last):
  File "/mnt/PC/polygnn/test.py", line 189, in <module>
    test()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/main.py", line 94, in decorated_main
    _run_hydra(
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/_internal/utils.py", line 394, in _run_hydra
    _run_app(
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/_internal/utils.py", line 457, in _run_app
    run_and_report(
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/_internal/utils.py", line 223, in run_and_report
    raise ex
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/_internal/utils.py", line 220, in run_and_report
    return func()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/_internal/utils.py", line 458, in <lambda>
    lambda: hydra.run(
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/_internal/hydra.py", line 132, in run
    _ = ret.return_value
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/core/utils.py", line 260, in return_value
    raise self._return_value
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/hydra/core/utils.py", line 186, in run_job
    ret.return_value = task_function(task_cfg)
  File "/mnt/PC/polygnn/test.py", line 185, in test
    mp.spawn(run_eval, args=(world_size, dataset, cfg), nprocs=world_size, join=True)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 239, in spawn
    return start_processes(fn, args, nprocs, join, daemon, start_method='spawn')
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 197, in start_processes
    while not context.join():
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 160, in join
    raise ProcessRaisedException(msg, error_index, failed_process.pid)
torch.multiprocessing.spawn.ProcessRaisedException: 

-- Process 0 terminated with the following error:
Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 69, in _wrap
    fn(i, *args)
  File "/mnt/PC/polygnn/test.py", line 112, in run_eval
    for batch in pbar:
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/tqdm/std.py", line 1181, in __iter__
    for obj in iterable:
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 633, in __next__
    data = self._next_data()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1345, in _next_data
    return self._process_data(data)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1371, in _process_data
    data.reraise()
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/_utils.py", line 644, in reraise
    raise exception
TypeError: Caught TypeError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/wly/anaconda3/envs/polygnn/lib/python3.10/site-packages/torch_geometric/data/dataset.py", line 258, in __getitem__
    data = self.get(self.indices()[idx])
  File "/mnt/PC/polygnn/dataset.py", line 251, in get
    data.num_nodes = len(data.y)
TypeError: object of type 'NoneType' has no len()

I wonder what data.y is supposed to be. Printing the data object gives:

data Data(edge_index=[2, 4], num_nodes=3, num_points=5350, points=[5350, 3], queries=[3, 16, 3], batch_points=[5350], train_mask=[3], val_mask=[3], test_mask=[3], name='0040')
@chenzhaiyu
Owner

Hi @rorrewang, data.y holds the labels, which you do not need for inference, unless you happen to have ground-truth labels and want to compute metrics against them. In your case you can simply remove (or guard) the lines that reference it.
