- Reduce the Python AST mass threshold from 40 to 32 (Classic's default is 28)
- Update the remediation point formula to match Classic's computation
- Don't penalize extra for identical duplication
Change from

`remediation_points = x * score`

to

`remediation_points = x + (score - threshold) * y`

This change increases overall parity with Classic.
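As a rough sketch of the new formula, the snippet below plugs in Classic's constants (`1_500_000` for the base and `50_000` per unit of overage, per the formula later in this description) for the `x` and `y` placeholders; the actual constants used on Platform may differ.

```python
# Hypothetical sketch of the new remediation-points formula.
# BASE_POINTS (x) and POINTS_PER_OVERAGE (y) use Classic's constants
# for illustration only.
BASE_POINTS = 1_500_000
POINTS_PER_OVERAGE = 50_000

def remediation_points(score, threshold):
    # Flat base cost plus a linear charge for how far the node's
    # score exceeds the configured mass threshold.
    overage = score - threshold
    return BASE_POINTS + overage * POINTS_PER_OVERAGE
```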
Note on mass difference:
The mass of a node corresponds to its size. Specifying a minimum threshold
tells Code Climate to ignore duplication in nodes below a certain size
(e.g. one-liners).
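A minimal sketch of how a mass threshold filters out small nodes; the node list and names here are hypothetical, not the engine's actual data structures:

```python
# Nodes below the mass threshold are ignored; only larger ones are
# considered duplication candidates.
THRESHOLD = 32

nodes = [("one-liner", 5), ("small helper", 20), ("duplicated block", 45)]
flagged = [name for name, mass in nodes if mass >= THRESHOLD]
```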
Comparing issue **mass** between the parsers on Platform and Classic:
| Platform | Classic | Platform / Classic |
| -------- | ------- | ------------------ |
| 42       | 39      | 1.07               |
| 45       | 40      | 1.125              |
| 66       | 57      | 1.15789            |
| 123      | 109     | 1.1284             |
| 126      | 93      | 1.3548             |
| 246      | 218     | 1.1284             |
I've estimated the mass difference factor to be ~1.15.
Since the default Python duplication mass threshold on Classic was 28,
and `28 * 1.15 ≈ 32.2`, I've lowered our current default threshold
for Python on Platform from 40 to 32.
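The threshold estimate above can be reproduced from the table data; the variable names here are illustrative:

```python
# (Platform mass, Classic mass) pairs from the comparison table.
pairs = [(42, 39), (45, 40), (66, 57), (123, 109), (126, 93), (246, 218)]
ratios = [p / c for p, c in pairs]
mean_ratio = sum(ratios) / len(ratios)  # clusters around ~1.15

# Scale Classic's Python default threshold (28) by the estimated factor,
# then round down to get the new Platform default.
scaled = 28 * 1.15
```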
On Classic, Python duplication issues were penalized in terms of
remediation points as follows:

`1_500_000 + overage * 50_000`

where overage = **score** - **threshold**, and score = mass for similar
code, or mass * num_occurrences for identical code.
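Classic's scoring described above can be sketched as follows (a simplified model, not the engine's actual code):

```python
# Classic scored similar code by mass alone, but multiplied mass by the
# occurrence count for identical code before charging the overage.
def classic_score(mass, num_occurrences, identical):
    return mass * num_occurrences if identical else mass

def classic_remediation_points(mass, num_occurrences, identical,
                               threshold=28):
    overage = classic_score(mass, num_occurrences, identical) - threshold
    return 1_500_000 + overage * 50_000
```

Note how identical duplication is charged extra: the same mass at the same occurrence count yields far more points when flagged as identical, which is the behavior this change removes.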
Since remediation points are a function of the effort required to fix an
issue, we're making a behavioral change: identical duplication is no
longer penalized extra.