
batched operation in pytorch pnp node #16

@flandrewries

Description


Hello,
I'm using the released code of your pnp_node.py.
My inputs are batched points, each with a different pose, so I would like to use this operation:

    # # Alternatively, disentangle batch element optimization:
    # for i in range(p2d.size(0)):
    #     Ki = K[i:(i+1),...] if K is not None else None
    #     theta[i, :] = self._run_optimization(p2d[i:(i+1),...],
    #         p3d[i:(i+1),...], w[i:(i+1),...], Ki, y=theta[i:(i+1),...])
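For what it's worth, one common pattern that avoids assigning into slices of a pre-allocated `theta` is to collect each per-element result and concatenate them, so every output stays connected to the autograd graph. A minimal sketch, where `solve_one` is a hypothetical stand-in for `self._run_optimization` (not the actual pnp_node.py API):

```python
import torch

def run_batched(p2d, p3d, w, K, solve_one):
    # solve_one is a hypothetical stand-in for self._run_optimization.
    # Collecting the per-element outputs and concatenating them keeps
    # each result in the autograd graph, unlike writing results into
    # slices of a pre-allocated leaf tensor.
    thetas = []
    for i in range(p2d.size(0)):
        Ki = K[i:i + 1] if K is not None else None
        thetas.append(solve_one(p2d[i:i + 1], p3d[i:i + 1], w[i:i + 1], Ki))
    return torch.cat(thetas, dim=0)
```

With this shape, calling `backward()` on a loss computed from the returned tensor propagates gradients back into `w` for every batch element.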

However, I find that the upper-level function does not update the w value.
I printed theta.grad to check whether the gradient is calculated, and found that theta[i:(i+1),...].grad is None.
Maybe when the optimization is done, the slice or copy ops do not carry over the grad value.
Is there any way to solve this problem?
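As far as I understand, this is expected PyTorch behavior rather than a copy bug: `theta[i:(i+1),...]` returns a non-leaf view, and `.grad` is only populated on leaf tensors, so the gradient lands on the full `theta` instead of the slice. A small self-contained check (the tensor names here are just for illustration):

```python
import torch

# theta is a leaf tensor; slicing it produces a non-leaf view.
theta = torch.ones(4, 6, requires_grad=True)
view = theta[0:1, ...]

loss = (view ** 2).sum()
loss.backward()

# The slice's .grad is None, but the gradient is accumulated
# on the underlying leaf tensor.
print(view.grad)      # None (views are non-leaf tensors)
print(theta.grad[0])  # non-zero row; theta.grad[1:] stays zero
```

So checking `theta.grad` (the whole leaf tensor) rather than `theta[i:(i+1),...].grad` should show whether gradients are actually flowing.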

I would very much appreciate your advice.
