```
[rank1]:   File "/root/download/D-FINE/src/zoo/dfine/hybrid_encoder.py", line 267, in forward
[rank1]:     src, _ = self.self_attn(q, k, value=src, attn_mask=src_mask)
[rank1]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/root/miniconda3/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1736, in _wrapped_call_impl
[rank1]:     return self._call_impl(*args, **kwargs)
[rank1]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/root/miniconda3/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1844, in _call_impl
[rank1]:     return inner()
[rank1]:            ^^^^^^^
[rank1]:   File "/root/miniconda3/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1790, in inner
[rank1]:     result = forward_call(*args, **kwargs)
[rank1]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/root/miniconda3/lib/python3.12/site-packages/torch/nn/modules/activation.py", line 1368, in forward
[rank1]:     attn_output, attn_output_weights = F.multi_head_attention_forward(
[rank1]:                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank1]:   File "/root/miniconda3/lib/python3.12/site-packages/torch/nn/functional.py", line 6087, in multi_head_attention_forward
[rank1]:     key.shape == value.shape
[rank1]: AssertionError: key shape torch.Size([400, 1, 256]) does not match value shape torch.Size([1, 1, 256])
```
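For reference, PyTorch's `nn.MultiheadAttention` requires `key` and `value` to have identical shapes, so this assertion means the tensor passed as `value=src` reached `self_attn` with a sequence length of 1 while `q`/`k` carry 400 tokens (plausibly a flattened 20×20 feature map). A minimal sketch that reproduces the assertion, with shapes copied from the traceback (`num_heads=8` is an assumption, the real model config may differ):

```python
import torch
import torch.nn as nn

# Default layout is seq-first: (seq_len, batch, embed_dim)
self_attn = nn.MultiheadAttention(embed_dim=256, num_heads=8)

q = torch.randn(400, 1, 256)    # 400 query tokens
k = torch.randn(400, 1, 256)    # key matches the query length
src = torch.randn(1, 1, 256)    # wrong: value collapsed to a single token

# Raises: AssertionError: key shape torch.Size([400, 1, 256])
#         does not match value shape torch.Size([1, 1, 256])
out, _ = self_attn(q, k, value=src)
```

If `src` is already collapsed like this at line 267, the root cause is likely upstream of the attention call, e.g. in how the feature map is flattened before entering the encoder.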
Have you tried starting your custom dataset's annotation category IDs at 0 instead of 1?
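In case the dataset is in COCO format, a quick way to re-index the categories might look like the sketch below (file paths are placeholders; it assumes the standard COCO `categories` / `annotations` fields):

```python
import json

# Load a COCO-style annotation file (path is a placeholder)
with open("annotations/train.json") as f:
    coco = json.load(f)

# Build a mapping from the existing IDs to a 0-based range, e.g. {1: 0, 2: 1, ...}
old_ids = sorted(c["id"] for c in coco["categories"])
remap = {old: new for new, old in enumerate(old_ids)}

# Apply the mapping to both the category list and every annotation
for c in coco["categories"]:
    c["id"] = remap[c["id"]]
for ann in coco["annotations"]:
    ann["category_id"] = remap[ann["category_id"]]

with open("annotations/train_zero_indexed.json", "w") as f:
    json.dump(coco, f)
```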