
Torch Error ? #78

Open
XTRMsavage opened this issue Jun 6, 2024 · 0 comments

Comments

@XTRMsavage

After running the installation and launching the Gradio app, I get the error below. How do I resolve it?

```
C:\Users\admin\anaconda3\envs\omost\lib\site-packages\transformers\models\llama\modeling_llama.py:649: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:455.)
  attn_output = torch.nn.functional.scaled_dot_product_attention(
Exception in thread Thread-9 (generate):
Traceback (most recent call last):
  File "C:\Users\admin\anaconda3\envs\omost\lib\threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "C:\Users\admin\anaconda3\envs\omost\lib\threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\transformers\generation\utils.py", line 1758, in generate
    result = self._sample(
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\transformers\generation\utils.py", line 2397, in _sample
    outputs = self(
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\accelerate\hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\transformers\models\llama\modeling_llama.py", line 1164, in forward
    outputs = self.model(
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\torch\nn\modules\module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "C:\Users\admin\anaconda3\envs\omost\lib\site-packages\torch\nn\modules\module.py", line 1541, in _call_impl
    return forward_call(*args, **
```

Thanks
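
Note: the UserWarning about flash attention is usually benign on Windows builds of PyTorch (they ship without flash-attention kernels), and the actual exception type and message are in the part of the traceback that got cut off above. One common workaround for crashes on the scaled-dot-product-attention path is to force the eager attention implementation when loading the model. A minimal sketch, assuming transformers >= 4.36 (where the `attn_implementation` argument exists); the model id below is illustrative, not taken from this issue:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical model id for illustration; substitute the checkpoint
# that the Omost gradio_app actually loads.
model_id = "lllyasviel/omost-llama-3-8b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,     # half precision to fit consumer GPUs
    device_map="auto",             # let accelerate place the weights
    attn_implementation="eager",   # avoid the SDPA/flash-attention code path
)
```

If the thread still dies with eager attention, the missing tail of the traceback (the actual exception) would be needed to diagnose further.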