
Add exportable baby llama example #4345

Closed · wants to merge 1 commit into from

Conversation

mcremon-meta (Contributor)
Summary: Add a small LLaMa model, based on the babyllama paper. Note that this test case is only one layer by default, and the number of layers can be adjusted in the test.

Differential Revision: D60073137
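The model code itself is not shown in this thread. As a rough, hypothetical illustration of what a baby-llama-style test model with a configurable layer count looks like, here is a sketch of a config and parameter-count helper; all names and sizes below are assumptions for illustration, not the actual code from this PR:

```python
from dataclasses import dataclass

@dataclass
class BabyLlamaConfig:
    # Tiny, hypothetical sizes; the PR's real defaults may differ.
    dim: int = 64          # embedding / hidden width
    hidden_dim: int = 256  # FFN inner width
    n_heads: int = 4       # attention heads
    n_layers: int = 1      # one layer by default, adjustable in the test
    vocab_size: int = 512

def count_params(cfg: BabyLlamaConfig) -> int:
    """Rough parameter count for a llama-style decoder: token embedding,
    per-layer attention + SwiGLU FFN + 2 RMSNorms, final norm, and an
    untied output head (biases and rotary tables omitted)."""
    emb = cfg.vocab_size * cfg.dim
    attn = 4 * cfg.dim * cfg.dim        # wq, wk, wv, wo
    ffn = 3 * cfg.dim * cfg.hidden_dim  # w1, w2, w3 (SwiGLU)
    norms = 2 * cfg.dim                 # attention-norm + FFN-norm weights
    per_layer = attn + ffn + norms
    head = cfg.dim * cfg.vocab_size
    return emb + cfg.n_layers * per_layer + cfg.dim + head

one_layer = count_params(BabyLlamaConfig(n_layers=1))
two_layer = count_params(BabyLlamaConfig(n_layers=2))
```

Bumping `n_layers` in the test adds exactly one decoder block's worth of parameters per extra layer, which is why a one-layer default keeps the export fast while deeper variants remain one config change away.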

pytorch-bot commented Jul 22, 2024
🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/4345

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit d9ab717 with merge base 5a20a49:

FLAKY - The following job failed but was likely due to flakiness present on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

facebook-github-bot added the CLA Signed label Jul 22, 2024

facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D60073137

mcremon-meta added a commit that referenced this pull request Jul 23, 2024 (same summary and Differential Revision as above).
facebook-github-bot pushed a commit that referenced this pull request Jul 24, 2024

Summary: same as above, plus: removed some pyre changes that broke the OSS AoT export, and added some required passes and operators.

Differential Revision: D60073137
facebook-github-bot pushed a commit that referenced this pull request Jul 25, 2024 (same summary as above).
mcremon-meta added a commit that referenced this pull request Jul 25, 2024 (same summary as above; Pull Request resolved: #4345).
facebook-github-bot pushed a commit that referenced this pull request Jul 26, 2024 (same summary as above).
mcremon-meta added a commit that referenced this pull request Jul 26, 2024 (same summary as above; Pull Request resolved: #4345).
facebook-github-bot pushed a commit that referenced this pull request Jul 26, 2024 (same summary as above).
facebook-github-bot (Contributor)
This pull request has been merged in 1e14333.

Labels: CLA Signed (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed), fb-exported, Merged

3 participants