Add filter block to filter iterations (#1927)
Follow-up after #1298
Vampire authored Apr 22, 2024
1 parent 536d326 commit 1f3b694
Showing 28 changed files with 600 additions and 28 deletions.
17 changes: 17 additions & 0 deletions docs/data_driven_testing.adoc
@@ -442,6 +442,23 @@ yield different numbers of iterations. If a data provider runs out of values soo
Variable assignments don't affect the number of iterations. A `where:` block that only contains assignments yields
exactly one iteration.

== Filtering iterations

If you want to filter out some iterations, you can use the `@IgnoreIf` annotation on the feature method.
This has one significant drawback though: the filtered iterations are reported as skipped in test reports.
As an alternative, you can add a `filter` block after the `where` block.
The content of this block is treated like the content of an `expect` block.
If any of the implicit or explicit assertions in the `filter` block fails, the iteration is treated as if it did not exist.
This also means that if all iterations are filtered out, the feature fails in the same way as if the data provider had no content.

In the following example, the feature is executed with the values `1`, `2`, `4`, and `5` for the variable `i`;
the iteration where `i` would be `3` is filtered out by the `filter` block:

[source,groovy,indent=0]
----
include::{sourcedir}/datadriven/DataSpec.groovy[tag=excluding-iterations]
----
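
For illustration, such a feature could look roughly like the following sketch (the feature name and the trivial `expect:` condition are assumptions for this example, not the actual included snippet):

[source,groovy,indent=0]
----
def "filtering out a single iteration"() {
  expect:
  i > 0

  where:
  i << [1, 2, 3, 4, 5]

  filter:
  i != 3   // the iteration with i == 3 is treated as if it did not exist
}
----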

== Closing of Data Providers

After all iterations have completed, the zero-argument `close` method is called on all data providers that have
1 change: 1 addition & 0 deletions docs/release_notes.adoc
@@ -8,6 +8,7 @@ include::include.adoc[]
=== Highlights

* Add support for combining two or more data providers using cartesian product spockIssue:1062[]
* Add support for a `filter` block after a `where` block to filter out unwanted iterations

== 2.4-M4 (2024-03-21)

19 changes: 17 additions & 2 deletions docs/spock_primer.adoc
@@ -148,7 +148,7 @@ interacting feature methods), and may occur more than once.

Spock has built-in support for implementing each of the conceptual phases of a feature method. To this end, feature
methods are structured into so-called _blocks_. Blocks start with a label, and extend to the beginning of the next block,
or the end of the method. There are six kinds of blocks: `given`, `when`, `then`, `expect`, `cleanup`, and `where` blocks.
or the end of the method. There are seven kinds of blocks: `given`, `when`, `then`, `expect`, `cleanup`, `where`, and `filter` blocks.
Any statements between the beginning of the method and the first explicit block belong to an implicit `given` block.

A feature method must have at least one explicit (i.e. labelled) block - in fact, the presence of an explicit block is
@@ -483,7 +483,7 @@ TIP: If a specification is designed in such a way that all its feature methods r

==== Where Blocks

A `where` block always comes last in a method, and may not be repeated. It is used to write data-driven feature methods.
A `where` block may only be followed by a `filter` block, and may not be repeated. It is used to write data-driven feature methods.
To give you an idea how this is done, have a look at the following example:

[source,groovy]
Expand All @@ -506,6 +506,21 @@ Although it is declared last, the `where` block is evaluated before the feature

The `where` block is further explained in the <<data_driven_testing.adoc#data-driven-testing,Data Driven Testing>> chapter.
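
For reference, a minimal data-driven feature using a `where` block might look like this sketch (names and values are illustrative, not the included snippet):

[source,groovy]
----
def "maximum of two numbers"() {
  expect:
  Math.max(a, b) == c

  where:
  a | b | c
  1 | 3 | 3
  7 | 4 | 7
  0 | 0 | 0
}
----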

==== Filter Blocks

A `filter` block may only follow a `where` block, always comes last in a method, and may not be repeated. It is used to filter out iterations in data-driven feature methods.
To give you an idea how this is done, have a look at the following example:

[source,groovy,indent=0]
----
include::{sourcedir}/datadriven/DataSpec.groovy[tag=excluding-iterations]
----

The content of the `filter` block is treated like the content of an `expect` block. If any of the implicit or explicit
assertions in it fails for a given iteration, that iteration is treated as if it did not exist.
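
To show where a `filter` block sits relative to the other blocks, here is an illustrative sketch (the stack example is an assumption, not the included snippet):

[source,groovy,indent=0]
----
def "block order with a filter block"() {
  given:
  def stack = new Stack()

  when:
  stack.push(element)

  then:
  stack.peek() == element

  where:
  element << ["a", "b", "c"]

  filter:
  element != "b"   // this iteration is treated as if it did not exist
}
----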

The `filter` block is further explained in the <<data_driven_testing.adoc#data-driven-testing,Data Driven Testing>> chapter.

== Helper Methods

Sometimes feature methods grow large and/or contain lots of duplicated code. In such cases it can make sense to introduce
Original file line number Diff line number Diff line change
@@ -46,4 +46,6 @@ public void visitThenBlock(ThenBlock block) throws Exception {}
public void visitCleanupBlock(CleanupBlock block) throws Exception {}
@Override
public void visitWhereBlock(WhereBlock block) throws Exception {}
@Override
public void visitFilterBlock(FilterBlock block) throws Exception {}
}
Original file line number Diff line number Diff line change
@@ -191,7 +191,7 @@ private boolean handleInteraction(InteractionRewriter rewriter, ExpressionStatem
}

private boolean handleImplicitCondition(ExpressionStatement stat) {
if (!(stat == currTopLevelStat && isThenOrExpectBlock()
if (!(stat == currTopLevelStat && isThenOrExpectOrFilterBlock()
|| currSpecialMethodCall.isConditionMethodCall()
|| currSpecialMethodCall.isConditionBlock()
|| currSpecialMethodCall.isGroupConditionBlock()
@@ -339,8 +339,8 @@ private ClosureExpression getCurrentWithOrMockClosure() {
return null;
}

private boolean isThenOrExpectBlock() {
return (block instanceof ThenBlock || block instanceof ExpectBlock);
private boolean isThenOrExpectOrFilterBlock() {
return (block instanceof ThenBlock || block instanceof ExpectBlock || block instanceof FilterBlock);
}

private boolean isInteractionExpression(InteractionRewriter rewriter, ExpressionStatement stat) {
Original file line number Diff line number Diff line change
@@ -237,4 +237,8 @@ public void visitCleanupBlock(CleanupBlock block) throws Exception {
public void visitWhereBlock(WhereBlock block) throws Exception {
addBlockMetadata(block, BlockKind.WHERE);
}

public void visitFilterBlock(FilterBlock block) throws Exception {
addBlockMetadata(block, BlockKind.FILTER);
}
}
Original file line number Diff line number Diff line change
Expand Up @@ -368,12 +368,22 @@ private ClassNode getPlainReference(ClassNode type) {
// s.t. missing method parameters are added; these parameters
// will then be used by DeepBlockRewriter
private void handleWhereBlock(Method method) {
Block block = method.getLastBlock();
if (!(block instanceof WhereBlock)) return;
Block lastblock = method.getLastBlock();
FilterBlock filterBlock;
WhereBlock whereBlock;
if (lastblock instanceof FilterBlock) {
filterBlock = (FilterBlock) lastblock;
whereBlock = (WhereBlock) lastblock.getPrevious();
} else if (lastblock instanceof WhereBlock) {
filterBlock = null;
whereBlock = (WhereBlock) lastblock;
} else {
return;
}

DeepBlockRewriter deep = new DeepBlockRewriter(this);
deep.visit(block);
WhereBlockRewriter.rewrite((WhereBlock) block, this, deep.isDeepNonGroupedConditionFound());
deep.visit(whereBlock);
WhereBlockRewriter.rewrite(whereBlock, filterBlock, this, deep.isDeepNonGroupedConditionFound());
}

@Override
@@ -501,7 +511,7 @@ public void visitCleanupBlock(CleanupBlock block) {

method.getStatements().add(tryFinally);

// a cleanup-block may only be followed by a where-block, whose
// a cleanup-block may only be followed by a where-block and filter-block, whose
// statements are copied to newly generated methods rather than
// the original method
movedStatsBackToMethod = true;
Original file line number Diff line number Diff line change
@@ -16,6 +16,7 @@

package org.spockframework.compiler;

import org.spockframework.compiler.model.FilterBlock;
import org.spockframework.compiler.model.WhereBlock;
import org.spockframework.runtime.model.DataProcessorMetadata;
import org.spockframework.runtime.model.DataProviderMetadata;
@@ -46,6 +47,7 @@
*/
public class WhereBlockRewriter {
private final WhereBlock whereBlock;
private final FilterBlock filterBlock;
private final IRewriteResources resources;
private final boolean defineErrorRethrower;
private final InstanceFieldAccessChecker instanceFieldAccessChecker;
Expand All @@ -64,16 +66,17 @@ public class WhereBlockRewriter {
private final List<Expression> dataVariableMultiplications = new ArrayList<>();
private int localVariableCount = 0;

private WhereBlockRewriter(WhereBlock whereBlock, IRewriteResources resources, boolean defineErrorRethrower) {
private WhereBlockRewriter(WhereBlock whereBlock, FilterBlock filterBlock, IRewriteResources resources, boolean defineErrorRethrower) {
this.whereBlock = whereBlock;
this.filterBlock = filterBlock;
this.resources = resources;
this.defineErrorRethrower = defineErrorRethrower;
instanceFieldAccessChecker = new InstanceFieldAccessChecker(resources);
errorRethrowerUsageDetector = defineErrorRethrower ? new ErrorRethrowerUsageDetector() : null;
}

public static void rewrite(WhereBlock block, IRewriteResources resources, boolean defineErrorRethrower) {
new WhereBlockRewriter(block, resources, defineErrorRethrower).rewrite();
public static void rewrite(WhereBlock block, FilterBlock filterBlock, IRewriteResources resources, boolean defineErrorRethrower) {
new WhereBlockRewriter(block, filterBlock, resources, defineErrorRethrower).rewrite();
}

private void rewrite() {
@@ -144,6 +147,7 @@ private void rewrite() {
handleFeatureParameters();
createDataProcessorMethod();
createDataVariableMultiplicationsMethod();
createFilterMethod();
}

private static boolean isMultiplicand(Statement stat) {
@@ -879,6 +883,40 @@ private void createDataVariableMultiplicationsMethod() {
whereBlock.getParent().getParent().getAst().addMethod(dataVariableMultiplicationsMethod);
}

private void createFilterMethod() {
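// Moves the filter-block conditions into a synthetic, static, void method that takes the
// processed data variables as parameters; presumably the runtime invokes it once per
// iteration and drops iterations whose conditions fail.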
if (dataProcessorVars.isEmpty() || (filterBlock == null)) return;

DeepBlockRewriter deep = new DeepBlockRewriter(resources);
deep.visit(filterBlock);

List<Statement> filterStats = new ArrayList<>(filterBlock.getAst());
filterBlock.getAst().clear();

instanceFieldAccessChecker.check(filterStats);

if (deep.isConditionFound()) {
resources.defineValueRecorder(filterStats, "");
}
if (deep.isDeepNonGroupedConditionFound()) {
resources.defineErrorRethrower(filterStats);
}

BlockStatement blockStat = new BlockStatement(filterStats, null);

MethodNode filterMethod = new MethodNode(
InternalIdentifiers.getFilterName(filterBlock.getParent().getAst().getName()),
Opcodes.ACC_PUBLIC | Opcodes.ACC_STATIC | Opcodes.ACC_SYNTHETIC,
ClassHelper.VOID_TYPE,
dataProcessorVars
.stream()
.map(variable -> new Parameter(ClassHelper.OBJECT_TYPE, variable.getName()))
.toArray(Parameter[]::new),
ClassNode.EMPTY_ARRAY,
blockStat);

filterBlock.getParent().getParent().getAst().addMethod(filterMethod);
}

private static InvalidSpecCompileException notAParameterization(ASTNode stat) {
return new InvalidSpecCompileException(stat,
"where-blocks may only contain parameterizations (e.g. 'salary << [1000, 5000, 9000]; salaryk = salary / 1000')");
Original file line number Diff line number Diff line change
@@ -118,7 +118,7 @@ public Block addNewBlock(Method method) {
}
@Override
public EnumSet<BlockParseInfo> getSuccessors(Method method) {
return EnumSet.of(AND, COMBINED, METHOD_END);
return EnumSet.of(AND, COMBINED, FILTER, METHOD_END);
}
},

Expand All @@ -133,6 +133,17 @@ public EnumSet<BlockParseInfo> getSuccessors(Method method) {
}
},

FILTER {
@Override
public Block addNewBlock(Method method) {
return method.addBlock(new FilterBlock(method));
}
@Override
public EnumSet<BlockParseInfo> getSuccessors(Method method) {
return EnumSet.of(AND, METHOD_END);
}
},

METHOD_END {
@Override
public Block addNewBlock(Method method) {
Original file line number Diff line number Diff line change
@@ -0,0 +1,38 @@
/*
* Copyright 2024 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.spockframework.compiler.model;

/**
* AST node representing a filter-block in a feature method.
*/
public class FilterBlock extends Block {
public FilterBlock(Method parent) {
super(parent);
setName("filter");
}

@Override
public void accept(ISpecVisitor visitor) throws Exception {
visitor.visitAnyBlock(this);
visitor.visitFilterBlock(this);
}

@Override
public BlockParseInfo getParseInfo() {
return BlockParseInfo.FILTER;
}
}
Original file line number Diff line number Diff line change
@@ -31,6 +31,7 @@ public interface ISpecVisitor {
void visitExpectBlock(ExpectBlock block) throws Exception;
void visitWhenBlock(WhenBlock block) throws Exception;
void visitThenBlock(ThenBlock block) throws Exception;
void visitFilterBlock(FilterBlock block) throws Exception;
void visitCleanupBlock(CleanupBlock block) throws Exception;
void visitWhereBlock(WhereBlock block) throws Exception;
}