
[BugFix][Ansor] Fixing Ansor Gradient Bug #16739

Merged 3 commits on Apr 1, 2024

Changes from 1 commit
Fixing ansor gradient bug
thaisacs committed Mar 19, 2024
commit fcf8e0350f60212ff895b39a60e5ff1d91b15d16
7 changes: 7 additions & 0 deletions python/tvm/auto_scheduler/task_scheduler.py
@@ -367,6 +367,10 @@ def tune(
                 task_idx = (task_idx + 1) % len(self.tasks)
             elif self.strategy == "gradient":
                 gradients = []
+
+                # fix gradient for task without schedule in warm up
+                self.best_costs[(self.best_costs == 1e10)] = 0
+
                 for i in range(len(self.tasks)):
                     if i in self.dead_tasks:
                         gradients.append(0)
@@ -418,6 +422,9 @@ def tune(
                     assert grad <= 0
                     gradients.append(grad)
 
+                # fix gradient for task without schedule in warm up
+                self.best_costs[(self.best_costs == 0)] = 1e10
+
                 if max(gradients) == min(gradients):
                     task_idx = np.random.choice(len(gradients))
                 else:
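The patch works around tasks that never produced a valid schedule during warm-up: their `best_costs` entries keep the sentinel value `1e10`, which would otherwise dominate the gradient computation. The fix temporarily zeroes those entries before computing gradients and restores the sentinel afterward. A minimal standalone sketch of this mask-and-restore pattern (the array contents and the gradient stand-in are illustrative, not TVM's actual computation):

```python
import numpy as np

SENTINEL = 1e10  # cost assigned to tasks with no measured schedule yet

# Hypothetical costs: task 1 produced no schedule during warm-up
best_costs = np.array([0.02, SENTINEL, 0.05])

# Step 1: zero out unmeasured tasks so they do not dominate the gradients
best_costs[best_costs == SENTINEL] = 0

# Stand-in for the real per-task gradient computation
gradients = -best_costs

# Step 2: restore the sentinel so later tuning rounds still treat
# the task as unmeasured
best_costs[best_costs == 0] = SENTINEL
```

Note that the restore step implicitly assumes no measured task has a true cost of exactly 0, since such an entry would also be reset to the sentinel.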