
Commit

now maybe
pat-alt committed Oct 30, 2024
1 parent 0199263 commit e366cfc
Showing 3 changed files with 14 additions and 0 deletions.
8 changes: 8 additions & 0 deletions CHANGELOG.md
@@ -8,6 +8,14 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),

## Version [1.3.6]

### Changed

- Slight changes to the implementation of `ProbeGenerator` (no longer calling a redundant `hinge_loss` function for all other generators).

### Added

- Added a warning message to the `ProbeGenerator` pointing to the issues with the current implementation.

## Version [1.3.5] - 2024-10-28

### Changed
1 change: 1 addition & 0 deletions src/generators/gradient_based/generators.jl
@@ -65,6 +65,7 @@ function ProbeGenerator(;
    penalty=[Objectives.distance_l1, Objectives.hinge_loss],
    kwargs...,
)
    @warn "The `ProbeGenerator` is currently not working adequately. In particular, gradients are not computed with respect to the Hinge loss term proposed in the paper. It is still possible, however, to use this generator to achieve a desired invalidation rate. See issue [#376](https://github.com/JuliaTrustworthyAI/CounterfactualExplanations.jl/issues/376) for details."
    user_loss = Objectives.losses_catalogue[loss]
    return GradientBasedGenerator(; loss=user_loss, penalty=penalty, λ=λ, kwargs...)
end
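
For context, here is a minimal usage sketch (not part of the commit) showing how the new warning surfaces and how, per its wording, the generator can still be used to target a desired invalidation rate. It assumes `ProbeGenerator` and the `Convergence` submodule are accessible as written; the `invalidation_rate` keyword and the placeholder variables in the commented call are illustrative assumptions, not details taken from this commit.

```julia
using CounterfactualExplanations

# Constructing the generator with its defaults; as of this commit the constructor
# emits the `@warn` message above, pointing to issue #376.
generator = ProbeGenerator()

# Per the warning, the generator can still be used to target a desired invalidation
# rate via `InvalidationRateConvergence` (keyword name assumed for illustration):
conv = CounterfactualExplanations.Convergence.InvalidationRateConvergence(; invalidation_rate=0.1)

# Placeholder call: `x`, `target`, `counterfactual_data`, and `M` stand in for a
# user-supplied factual instance, target label, dataset, and fitted model.
# ce = generate_counterfactual(x, target, counterfactual_data, M, generator; convergence=conv)
```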
5 changes: 5 additions & 0 deletions src/generators/gradient_based/utils.jl
@@ -16,6 +16,11 @@ By default, gradient-based search is considered to have converged as soon as the
function Convergence.conditions_satisfied(
    generator::AbstractGradientBasedGenerator, ce::AbstractCounterfactualExplanation
)
    if !(ce.convergence isa Convergence.GeneratorConditionsConvergence)
        # Temporary fix due to the fact that `ProbeGenerator` relies on `InvalidationRateConvergence`.
        @warn "Checking for generator conditions convergence is not implemented for this generator type. Returning `false`." maxlog=1
        return false
    end
    Δcounterfactual_state = ∇(generator, ce)
    Δcounterfactual_state = CounterfactualExplanations.apply_mutability(
        ce, Δcounterfactual_state
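
For illustration only (not part of the commit), the guard introduced above can be sketched with toy stand-in types; this shows the intended control flow, with the early return and the once-only warning, without relying on the package's own `Convergence` types.

```julia
# Toy stand-ins for the package's convergence types (names reused for clarity only).
abstract type AbstractConvergence end
struct GeneratorConditionsConvergence <: AbstractConvergence end
struct InvalidationRateConvergence <: AbstractConvergence end

function conditions_satisfied_sketch(convergence::AbstractConvergence)
    if !(convergence isa GeneratorConditionsConvergence)
        # Mirrors the new early return: warn once (maxlog=1) and report "not satisfied".
        @warn "Checking for generator conditions convergence is not implemented for this convergence type. Returning `false`." maxlog = 1
        return false
    end
    # Placeholder for the gradient-based check that the real function performs next.
    return true
end

conditions_satisfied_sketch(InvalidationRateConvergence())    # false (logs the warning once)
conditions_satisfied_sketch(GeneratorConditionsConvergence()) # true
```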
