chore: bump torch to 2.7.0 #8013


Merged

psychedelicious merged 2 commits into main from psyche/chore/bump-torch on May 19, 2025
Conversation

psychedelicious (Collaborator)

Summary

  • Update `pyproject.toml` to pin torch 2.7.0 (a sketch of the change is shown after this list)
  • Update `pins.json` so the launcher installs the latest CUDA 12.8 & ROCm 6.3 builds
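
For illustration only, the `pyproject.toml` side of this bump likely amounts to a one-line change to the torch specifier; the exact specifier style and any companion pins (e.g. torchvision) are assumptions here, so check the actual diff for the authoritative change:

```toml
# pyproject.toml: illustrative sketch, not the real diff
[project]
dependencies = [
  "torch~=2.7.0",  # bumped pin; exact specifier style is an assumption
]
```

The `pins.json` change is not sketched because its schema is launcher-specific; conceptually it points the installer at the torch 2.7.0 wheel indexes for CUDA 12.8 (`cu128`) and ROCm 6.3 (`rocm6.3`).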

Related Issues / Discussions

I've briefly tested MPS on macOS and CUDA on Linux with FLUX, SDXL, and SD1.5 - no obvious issues.

Quite a few users have been testing 2.7.0 on Windows with 50xx-series GPUs without any issues, AFAIK, so I think we are safe to bump this dependency.

QA Instructions

Test more? Broader smoke tests (e.g. ROCm on Linux, CUDA on Windows) are welcome; a minimal environment check is sketched below.
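
A quick way to sanity-check an installed environment after the bump (a minimal sketch, not an official QA script):

```python
# Minimal smoke check: confirm the bumped torch version and visible backends.
import torch

print("torch version:", torch.__version__)           # expect 2.7.0
print("CUDA available:", torch.cuda.is_available())  # NVIDIA, incl. 50xx series
if torch.cuda.is_available():
    print("CUDA device:", torch.cuda.get_device_name(0))
print("MPS available:", torch.backends.mps.is_available())  # macOS
```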

Merge Plan

n/a

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • Documentation added / updated (if applicable)
  • Updated What's New copy (if doing a release after this PR)

@github-actions github-actions bot added the Root and python-deps (PRs that change python dependencies) labels on May 19, 2025
@psychedelicious psychedelicious enabled auto-merge (rebase) May 19, 2025 02:23
- Update `pyproject.toml`
- Update `pins.json` so launcher installs latest CUDA 12.8 & ROCm 6.3
@psychedelicious psychedelicious force-pushed the psyche/chore/bump-torch branch from 7d6987a to 76b920d on May 19, 2025 02:23
@psychedelicious psychedelicious merged commit 8a7a498 into main May 19, 2025
12 checks passed
@psychedelicious psychedelicious deleted the psyche/chore/bump-torch branch May 19, 2025 02:29