chore: update release versions for 2.6 #3380


Merged · 2 commits · Feb 4, 2025
6 changes: 3 additions & 3 deletions MODULE.bazel
@@ -55,21 +55,21 @@ http_archive(
     name = "libtorch",
     build_file = "@//third_party/libtorch:BUILD",
     strip_prefix = "libtorch",
-    urls = ["https://download.pytorch.org/libtorch/test/cu126/libtorch-cxx11-abi-shared-with-deps-latest.zip"],
+    urls = ["https://download.pytorch.org/libtorch/cu126/libtorch-cxx11-abi-shared-with-deps-2.6.0%2Bcu126.zip"],
 )

 http_archive(
     name = "libtorch_pre_cxx11_abi",
     build_file = "@//third_party/libtorch:BUILD",
     strip_prefix = "libtorch",
-    urls = ["https://download.pytorch.org/libtorch/test/cu126/libtorch-shared-with-deps-latest.zip"],
+    urls = ["https://download.pytorch.org/libtorch/cu126/libtorch-shared-with-deps-2.6.0%2Bcu126.zip"],
 )

 http_archive(
     name = "libtorch_win",
     build_file = "@//third_party/libtorch:BUILD",
     strip_prefix = "libtorch",
-    urls = ["https://download.pytorch.org/libtorch/test/cu126/libtorch-win-shared-with-deps-latest.zip"],
+    urls = ["https://download.pytorch.org/libtorch/cu126/libtorch-win-shared-with-deps-2.6.0%2Bcu126.zip"],
 )

 # Download these tarballs manually from the NVIDIA website
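
The three libtorch archives move off the test channel's rolling "-latest" zips and onto release archives pinned to 2.6.0+cu126. As a quick aside, a standalone script (not part of this PR; standard library only, URLs copied from the new lines above) can confirm the pinned archives still resolve before kicking off a Bazel build:

import urllib.request

# HEAD-request each pinned archive; urlopen raises on any HTTP error.
URLS = [
    "https://download.pytorch.org/libtorch/cu126/libtorch-cxx11-abi-shared-with-deps-2.6.0%2Bcu126.zip",
    "https://download.pytorch.org/libtorch/cu126/libtorch-shared-with-deps-2.6.0%2Bcu126.zip",
    "https://download.pytorch.org/libtorch/cu126/libtorch-win-shared-with-deps-2.6.0%2Bcu126.zip",
]

for url in URLS:
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print(resp.status, url)
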
4 changes: 2 additions & 2 deletions README.md
@@ -117,8 +117,8 @@ auto results = trt_mod.forward({input_tensor});
 These are the following dependencies used to verify the testcases. Torch-TensorRT can work with other versions, but the tests are not guaranteed to pass.

 - Bazel 6.3.2
-- Libtorch 2.5.0.dev (latest nightly) (built with CUDA 12.4)
-- CUDA 12.4
+- Libtorch 2.6.0 (built with CUDA 12.6)
+- CUDA 12.6
 - TensorRT 10.7.0.23

 ## Deprecation Policy
4 changes: 2 additions & 2 deletions docker/dist-build.sh
@@ -5,9 +5,9 @@ set -x
 TOP_DIR=$(cd $(dirname $0); pwd)/..

 if [[ -z "${USE_CXX11}" ]]; then
-  BUILD_CMD="python -m pip wheel . --extra-index-url https://download.pytorch.org/whl/test/cu126 -w dist"
+  BUILD_CMD="python -m pip wheel . --extra-index-url https://download.pytorch.org/whl/cu126 -w dist"
 else
-  BUILD_CMD="python -m pip wheel . --config-setting="--build-option=--use-cxx11-abi" --extra-index-url https://download.pytorch.org/whl/test/cu126 -w dist"
+  BUILD_CMD="python -m pip wheel . --config-setting="--build-option=--use-cxx11-abi" --extra-index-url https://download.pytorch.org/whl/cu126 -w dist"
 fi

 # TensorRT restricts our pip version
@@ -4,7 +4,8 @@
 Torch Compile Advanced Usage
 ======================================================

-This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API.
+"""

 # %%
 # Imports and Model Definition
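
This hunk and every example-script hunk below make the same mechanical fix: the closing triple quote of the module docstring moves onto its own line. A minimal before/after sketch of the pattern (attributing it to an autoformatter is an assumption; the PR does not say):

# Before: the closing quotes ride on the docstring's last sentence.
BEFORE = '''Torch Compile Advanced Usage
============================

This interactive script is intended as an overview of ...'''

# After: the closing quotes stand alone, so the prose block stays intact
# when its last line is edited.
AFTER = '''Torch Compile Advanced Usage
============================

This interactive script is intended as an overview of ...
'''
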
@@ -4,7 +4,8 @@
 Compiling GPT2 using the dynamo backend
 ==========================================================

-This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Torch Export with Cudagraphs
 ======================================================

-This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well."""
+This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling ResNet using the Torch-TensorRT Dyanmo Frontend
 ==========================================================

-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling Llama2 using the dynamo backend
 ==========================================================

-This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling ResNet with dynamic shapes using the `torch.compile` backend
 ==========================================================

-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling BERT using the `torch.compile` backend
 ==============================================================

-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Dynamo Compile Advanced Usage
 ======================================================

-This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling a Transformer using torch.compile and TensorRT
 ==============================================================

-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling ResNet using the Torch-TensorRT Dyanmo Frontend
 ==========================================================

-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a ResNet model.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Dynamo Compile Advanced Usage
 ======================================================

-This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.dynamo.compile` works, and how it integrates with the new `torch.compile` API.
+"""

 # %%
 # Imports and Model Definition
@@ -4,7 +4,8 @@
 Compiling a Transformer using torch.compile and TensorRT
 ==============================================================

-This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model."""
+This interactive script is intended as a sample of the `torch_tensorrt.dynamo.compile` workflow on a transformer-based model.
+"""

 # %%
 # Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_compile_advanced_usage.py
@@ -4,7 +4,8 @@
 Torch Compile Advanced Usage
 ======================================================

-This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API."""
+This interactive script is intended as an overview of the process by which `torch_tensorrt.compile(..., ir="torch_compile", ...)` works, and how it integrates with the `torch.compile` API.
+"""

 # %%
 # Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_compile_resnet_example.py
@@ -4,7 +4,8 @@
 Compiling ResNet with dynamic shapes using the `torch.compile` backend
 ==========================================================

-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a ResNet model.
+"""

 # %%
 # Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_compile_transformers_example.py
@@ -4,7 +4,8 @@
 Compiling BERT using the `torch.compile` backend
 ==============================================================

-This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model."""
+This interactive script is intended as a sample of the Torch-TensorRT workflow with `torch.compile` on a BERT model.
+"""

 # %%
 # Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_export_cudagraphs.py
@@ -4,7 +4,8 @@
 Torch Export with Cudagraphs
 ======================================================

-This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well."""
+This interactive script is intended as an overview of the process by which the Torch-TensorRT Cudagraphs integration can be used in the `ir="dynamo"` path. The functionality works similarly in the `torch.compile` path as well.
+"""

 # %%
 # Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_export_gpt2.py
@@ -4,7 +4,8 @@
 Compiling GPT2 using the dynamo backend
 ==========================================================

-This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular GPT2 model.
+"""

 # %%
 # Imports and Model Definition
3 changes: 2 additions & 1 deletion examples/dynamo/torch_export_llama2.py
@@ -4,7 +4,8 @@
 Compiling Llama2 using the dynamo backend
 ==========================================================

-This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model."""
+This script illustrates Torch-TensorRT workflow with dynamo backend on popular Llama2 model.
+"""

 # %%
 # Imports and Model Definition
6 changes: 3 additions & 3 deletions py/requirements.txt
@@ -1,9 +1,9 @@
 numpy
 packaging
 pybind11==2.6.2
---extra-index-url https://download.pytorch.org/whl/test/cu126
-torch>=2.6.0.dev,<2.7.0
-torchvision>=0.20.0.dev,<0.22.0
+--extra-index-url https://download.pytorch.org/whl/cu126
+torch==2.6.0
+torchvision==0.21.0
 --extra-index-url https://pypi.ngc.nvidia.com
 pyyaml
 transformers==4.40.2
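
The requirements switch from the prerelease test index with open-ended ranges to the release index with exact pins. A small standalone sketch (not from the repo) that audits an installed environment against these pins, using only the standard library:

from importlib.metadata import PackageNotFoundError, version

# Pins copied from the new requirement lines; local tags like "2.6.0+cu126"
# are trimmed before comparing.
PINS = {"torch": "2.6.0", "torchvision": "0.21.0", "transformers": "4.40.2"}

for name, expected in PINS.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        print(f"{name}: MISSING (want {expected})")
        continue
    ok = installed.split("+")[0] == expected
    print(f"{name}: {installed} (want {expected}) -> {'OK' if ok else 'MISMATCH'}")
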
2 changes: 1 addition & 1 deletion py/torch_tensorrt/_Input.py
@@ -261,7 +261,7 @@ def _supported_input_size_type(input_size: Any) -> bool:

     @staticmethod
     def _parse_tensor_domain(
-        domain: Optional[Tuple[float, float]]
+        domain: Optional[Tuple[float, float]],
     ) -> Tuple[float, float]:
         """
         Produce a tuple of integers which specifies a tensor domain in the interval format: [lo, hi)
2 changes: 1 addition & 1 deletion py/torch_tensorrt/_enums.py
@@ -1200,7 +1200,7 @@ def _from(

     @classmethod
     def try_from(
-        c: Union[trt.EngineCapability, EngineCapability]
+        c: Union[trt.EngineCapability, EngineCapability],
     ) -> Optional[EngineCapability]:
         """Create a Torch-TensorRT engine capability enum from a TensorRT engine capability enum.
6 changes: 3 additions & 3 deletions py/torch_tensorrt/dynamo/conversion/_TRTBuilderMonitor.py
@@ -53,13 +53,13 @@ def _redraw(self, *, blank_lines: int = 0) -> None:
        if self._render:

            def clear_line() -> None:
-                print("\x1B[2K", end="")
+                print("\x1b[2K", end="")

            def move_to_start_of_line() -> None:
-                print("\x1B[0G", end="")
+                print("\x1b[0G", end="")

            def move_cursor_up(lines: int) -> None:
-                print("\x1B[{}A".format(lines), end="")
+                print("\x1b[{}A".format(lines), end="")

            def progress_bar(steps: int, num_steps: int) -> str:
                INNER_WIDTH = 10
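
These three hunks only change the escape-string case: "\x1B" and "\x1b" encode the same ESC byte, so behavior is identical and the rewrite looks like formatter output. For readers unfamiliar with the sequences, a standalone sketch (assuming any ANSI-capable terminal) of how they redraw a status area in place:

import sys
import time

def redraw(lines):
    for line in lines:
        print("\x1b[2K", end="")   # erase the entire current line
        print("\x1b[0G", end="")   # return the cursor to column 0
        print(line)                # write the fresh contents
    print("\x1b[{}A".format(len(lines)), end="")  # hop back up over the block
    sys.stdout.flush()

for step in range(1, 4):
    redraw(["building engine: step {}/3".format(step), "status: running"])
    time.sleep(0.5)
print("\n" * 2, end="")  # finally, step past the two status lines
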
4 changes: 2 additions & 2 deletions py/torch_tensorrt/dynamo/conversion/impl/activation/ops.py
@@ -247,7 +247,7 @@ def hard_sigmoid(
     operation_type = trt.ActivationType.HARD_SIGMOID

     def hard_sigmoid_dyn_range_fn(
-        dyn_range: Tuple[float, float]
+        dyn_range: Tuple[float, float],
     ) -> Tuple[float, float]:
         def hard_sigmoid_fn(x: float) -> float:
             return max(0, min(1, alpha * x + beta))
@@ -310,7 +310,7 @@ def thresholded_relu(
     operation_type = trt.ActivationType.THRESHOLDED_RELU

     def thresholded_relu_dyn_range_fn(
-        dyn_range: Tuple[float, float]
+        dyn_range: Tuple[float, float],
     ) -> Tuple[float, float]:
         def thresholded_relu_fn(x: float) -> float:
             return x if x > alpha else 0
2 changes: 1 addition & 1 deletion py/torch_tensorrt/dynamo/utils.py
@@ -465,7 +465,7 @@ def to_torch_device(device: Optional[Union[Device, torch.device, str]]) -> torch


 def to_torch_tensorrt_device(
-    device: Optional[Union[Device, torch.device, str]]
+    device: Optional[Union[Device, torch.device, str]],
 ) -> Device:
     """Cast a device-type to torch_tensorrt.Device
2 changes: 1 addition & 1 deletion py/torch_tensorrt/fx/test/converters/acc_op/test_where.py
@@ -101,7 +101,7 @@ def __init__(self, x_shape, y_shape):
            def forward(self, condition):
                return torch.where(condition, self.x, self.y)

-        inputs = [(torch.randn(condition_shape) > 0)]
+        inputs = [torch.randn(condition_shape) > 0]
        self.run_test(
            Where(x_shape, y_shape),
            inputs,
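
The only change here drops redundant parentheses around the condition tensor. For context, a tiny standalone sketch (not from the test suite) of the torch.where pattern the test exercises:

import torch

cond = torch.randn(2, 2) > 0    # the parentheses removed above were redundant
x = torch.full((2, 2), 1.0)
y = torch.full((2, 2), -1.0)
print(torch.where(cond, x, y))  # picks from x where cond is True, else from y
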
5 changes: 2 additions & 3 deletions py/torch_tensorrt/fx/tracer/acc_tracer/acc_tracer.py
@@ -10,7 +10,6 @@
 from typing import (
     Any,
     Callable,
-    cast,
     Dict,
     Iterable,
     Optional,
@@ -19,6 +18,7 @@
     Tuple,
     Type,
     Union,
+    cast,
 )

 import torch
@@ -32,7 +32,6 @@

 from . import acc_normalizer, acc_ops, acc_shape_prop, acc_utils  # noqa: F401

-
 _LOGGER = logging.getLogger(__name__)


@@ -517,7 +516,7 @@ def _replace_transpose_last_dims_impl(
     changed = False

     def _calculate_dim(
-        transpose_dim: Union[torch.fx.Node, int]
+        transpose_dim: Union[torch.fx.Node, int],
     ) -> Union[torch.fx.Node, int]:
         nonlocal transpose_input_node
         nonlocal changed
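
Here the typing import of cast moves to the end of the list and a duplicate blank line goes away. The new position follows plain case-sensitive ordering, where uppercase names sort before lowercase ones (that an import sorter such as isort or ruff produced it is an assumption):

# Plain ASCII sort: every uppercase-initial name precedes "cast".
names = ["Any", "Callable", "Dict", "Iterable", "Union", "cast"]
print(sorted(names))  # ['Any', 'Callable', 'Dict', 'Iterable', 'Union', 'cast']
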
4 changes: 2 additions & 2 deletions pyproject.toml
@@ -9,7 +9,7 @@ requires = [
     "typing-extensions>=4.7.0",
     "future>=0.18.3",
     "tensorrt-cu12>=10.7.0.post1,<10.8.0",
-    "torch>=2.6.0.dev,<2.7.0",
+    "torch==2.6.0",
     "pybind11==2.6.2",
     "numpy",
 ]
@@ -54,7 +54,7 @@ keywords = [
     "inference",
 ]
 dependencies = [
-    "torch>=2.6.0.dev,<2.7.0",
+    "torch==2.6.0",
     "tensorrt>=10.7.0.post1,<10.8.0",
     "tensorrt-cu12>=10.7.0.post1,<10.8.0",
     "tensorrt-cu12-bindings>=10.7.0,<10.8.0",
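
pyproject.toml gets the same tightening in both the build requirements and the runtime dependencies: the nightly-friendly range torch>=2.6.0.dev,<2.7.0 becomes the exact pin torch==2.6.0. A short sketch with the packaging library (already listed in py/requirements.txt) showing what each constraint admits; the candidate versions are made up to span the interesting cases:

from packaging.specifiers import SpecifierSet
from packaging.version import Version

old = SpecifierSet(">=2.6.0.dev,<2.7.0")  # accepted nightlies and future 2.6.x
new = SpecifierSet("==2.6.0")             # accepts exactly the 2.6.0 release

# Expected: the dev build and 2.6.1 pass only the old range; 2.6.0 passes both.
for candidate in ["2.6.0.dev20250101", "2.6.0", "2.6.1"]:
    v = Version(candidate)
    print(f"{candidate:>20}  old: {v in old!s:5}  new: {v in new}")
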