[Executorch][llm] Enable local global attention in export_llama script #10836

Open · wants to merge 1 commit into main

Conversation

pytorchbot
Collaborator

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #10612 by @kimishpatel
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/kimishpatel/189/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/kimishpatel/189/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/kimishpatel/188/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/kimishpatel/189/orig
@diff-train-skip-merge


pytorch-bot bot commented May 13, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10836

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 13 New Failures

As of commit 7ee018b with merge base ef30b25:

NEW FAILURES - 13 jobs have failed; see the HUD link above for the full list.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label May 13, 2025
@kimishpatel kimishpatel force-pushed the gh/kimishpatel/188/orig branch from ed32169 to 084bfe2 Compare May 13, 2025 16:17
Base automatically changed from gh/kimishpatel/188/orig to main May 13, 2025 17:38
Pull Request resolved: #10612

Added a new option, --local_global_attention, that takes a pattern of window sizes determining which layers use local sliding-window attention. For example, [0, 256, 256, 0, 256, 256] can be used for a 6-layer transformer, or the shorter [0, 256, 256] can be given as a pattern to repeat across layers (see the sketch below).
ghstack-source-id: 283404674
@exported-using-ghexport

Differential Revision: [D73891423](https://our.internmc.facebook.com/intern/diff/D73891423/)
@kimishpatel kimishpatel force-pushed the gh/kimishpatel/189/orig branch from 7fddb24 to 7ee018b Compare May 13, 2025 17:52

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:.

If not, please add the release notes: none label.

To add a label, comment to pytorchbot, for example:
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.
